WHY WE ARE IMMORAL: WAR, VIOLENCE, AND THE IGNOBLE SAVAGE WITHIN
If you wish for peace, understand war.
—B. H. Liddell Hart, Strategy, 1967
In Rob Reiner’s 1992 film A Few Good Men, Jack Nicholson’s character—battle-hardened marine Colonel Nathan R. Jessup—is being cross-examined by Tom Cruise’s naive rookie navy lawyer, Lieutenant Daniel Kaffee. Kaffee is defending two marines accused of killing a fellow soldier named Santiago at Guantanamo base in Cuba. He thinks Jessup ordered a “code red,” an off-the-books command to rough up a lazy marine trainee in need of discipline, and that matters got tragically out of hand. Kaffee wants answers to specific questions about the incident. Jessup wants to lecture him on the meaning of freedom and the need to defend it, explaining: “Son, we live in a world that has walls. And those walls have to be guarded by men with guns. I have a greater responsibility than you can possibly fathom.”
Jessup complains that he does not have the luxury to ignore the fact that Santiago’s death ultimately saved lives by generating greater discipline, and that Kaffee does not really want the truth because “deep down, in places you don’t talk about at parties, you want me on that wall. You need me on that wall.” Disgusted that he should even have to bother with such elucidations, Jessup concludes: “I have neither the time nor the inclination to explain myself to a man who rises and sleeps under the blanket of the very freedom I provide, then questions the manner in which I provide it. I’d prefer you just said thank you and went on your way.”1
The simple observation that we live in a world with walls—and have for the past 6,000 years of recorded history—implies that those walls are needed. The constitutions of states cannot completely alter the constitution of humanity. Is that constitution an evil one, such that good walls will always be needed to make good neighbors? Or are we constitutionally good, but corrupted by evil circumstances and environments?
A philosophical conundrum that has plagued theologians and moral philosophers is known as the “problem of evil.” The Greek philosopher Epicurus, in his Aphorisms, stated it as early as 300 B.C.:
The gods can either take away evil from the world and will not, or, being willing to do so cannot; or they neither can nor will, or lastly, they are both able and willing. If they have the will to remove evil and cannot, then they are not omnipotent. If they can, but will not, then they are not benevolent. If they are neither able nor willing, then they are neither omnipotent nor benevolent. Lastly, if they are both able and willing to annihilate evil, how does it exist?2
Here is a simpler way to state the problem. The following three conditions are incompatible:
God is Omnipotent
God is Omnibenevolent
Evil Exists
If God is all-powerful, can He not prevent evil from existing? If God is all good, should He not prevent evil from existing? If evil exists, then either God is not all powerful or not all good. The eighteenth-century English poet Alexander Pope phrased a solution to the problem another way in An Essay on Man:
All Nature is but Art, unknown to thee;
All Chance, Direction, which thou canst not see;
All Discord, Harmony, not understood;
All partial Evil, universal Good:
And, spite of Pride, in erring Reason’s spite,
One truth is clear, “Whatever IS is RIGHT.”3
To explain away the problem of evil, believers often invoke the final clause in a modified version—read “God’s will” for “Whatever is.” We heard it in the wake of the December 1997 shootings of eight teenagers in a Kentucky high school, when the local minister declared that it was “God’s will” that his own son was spared a bullet. On the flip side of evil, Kenny McCaughey, father of the famed McCaughey septuplets, thanked God for the good health of the babies and their mother, and said it was “God’s will” that they have seven children: “I’m just confident the Lord’s going to handle this. He’s brought them this far and I think he’s going to carry it through.” The McCaugheys were offered a standard “selective abortion” option to reduce the number of babies and thus guarantee the health of the remaining fetuses and mother, but they declined, explaining that as God-fearing Christians they were strong abortion opponents. “That’s just the way we all feel about it. It’s going to be. If for some reason he decides to change it, that’s his will.”4
There is an obvious inconsistency here. God did not will the conception and birth of the McCaugheys’ seven children; modern medical technology did. If whatever is, is right, then Bobbi McCaughey’s infertility was also God’s will, along with His active intervention in generating the appropriate number of viable eggs in the in-vitro procedure. Why is it morally acceptable to alter God’s will of infertility through the intervention of modern medicine and technology, but not to opt for selective abortion through the same modern medical techniques? If Mrs. McCaughey had died because of complications of childbirth, would it still be God’s will? Would we not have been bombarded in the media by fertility technology Luddites for the hubris of modern medicine in trying to change what is “natural”? “Scientists should not play God,” we are told.5
As for the Kentucky high school killings, why was it God’s will to spare the minister’s son, but not the three girls who died? Was it God’s will that their lives be snuffed out as young teenagers? At the Columbine High School massacre, one of the victims, a young woman named Cassie Bernall, was gunned down after proclaiming her belief in God. A book about her life and proclamation, She Said Yes, rode the New York Times best-seller list for weeks.6 Are we to presume that this was her “reward” for having the courage to openly display her Christian faith? That hardly seems fair. And, on the other side of the story, what about all those nonbelieving high schoolers who lived? That hardly seems divinely just to believers. These events would seem to make God’s will both good and evil, or God’s power limited, or both.
In the wake of the terrorist attacks on September 11, 2001, we saw similar declarations of a divine force that intervened to spare those who survived. Most of the survivors spoke of being saved by divine grace. In fact, a more likely explanation is that the law of large numbers kicked in. In a situation in which the lives of thousands of people are endangered, any who survive do so by an extraordinary turn of events, any one of which seems so improbable as to be a miracle. Yet, as in the lottery, even when the odds of winning are astronomically stacked against any one person taking home the loot, someone is going to win. Put your life in place of that money and the stakes are elevated that much higher, and it isn’t surprising that divine intervention will be credited. But, as much as we may be willing to believe in miracles, from whence does our willingness to believe in evil arise?
The myth of pure evil is the belief that evil exists separately from individuals, or that evil exists within people as something like what we traditionally think of as an evil “force,” driving them to perform evil acts. If pure evil exists, however, then how can we hold people morally culpable for their actions? Evil is intimately linked to the problem of free will and determinism: if we do not have complete free will in our actions, how can we be held morally accountable? Further (and even more distressing), if evil does exist, then will we always be plagued by violence, war, genocide, crime, rape, and other evils?
One solution to the problem of evil is a semantic one. Evil as a descriptive adjective merely modifies something else, as in evil thoughts or evil deeds. But evil as a noun implies an existence all its own, as in an “evil force” or even an “evil person,” or “the force in nature that governs and gives rise to wickedness and sin,” or “the wicked or immoral part of someone or something,” and so on.7 In this latter sense I claim that there is no such thing as evil. There is no supernatural force operating outside the realm of the known laws of nature and human behavior that we can call evil. Calling something or someone “evil” gets us nowhere. It leads to no greater understanding. In a scientific sense it is an ultimately indefinable term. That is, there is no way to establish quantifiable criteria by which we may distinguish between something or someone that is “evil” or “not evil,” or shades of evil in between. The tendency to use the term at all comes from our Western Platonic tendency to think in terms of essences, or nonchanging “things” or “types” that are what they are by their very nature. This is one reason people have such a difficult time accepting the theory of evolution—we tend to view a species as an essence, a type, a fixed and unchanging entity. But, in fact, species do change, however slowly, and their essences are only temporary. Analogously, evil is not a fixed entity or essence. It is not a thing. Evil is a descriptive term for a range of environmental events and human behaviors that we describe and interpret as bad, wrong, awful, undesirable, or whatever appropriately descriptive adjective or synonym for evil is chosen. To call something “evil” does not lead us to a deeper understanding of the cause of evil behavior.
In this chapter, I want to focus not on evil as a metaphysical concept that exists outside the natural realm, but on evil as a physical concept that exists entirely within the natural realm as behavior. This is a shift from the supernatural to the natural. (On a larger scale, I go so far as to claim that there is no such thing as the paranormal and supernatural. There is only the normal and the natural, and mysteries we have yet to explain through them.) If there were no humans there would be no evil. Earthquakes that kill people are not, in and of themselves, evil. A shift between two tectonic plates that causes the earth to make a sudden and dramatic movement cannot possibly be considered evil outside the effects such an earthquake might have on the humans living near the fault line. It is the effects of the earthquake on our fellow humans that we judge to be evil. Evil as a physical concept requires human evaluation of a behavior and its effects on humans. As such, bacterial diseases cannot be inherently evil. By causing humans to sneeze, cough, vomit, and have diarrhea, bacteria are highly successful organisms, spreading themselves far and wide. As their human hosts, we may label the effects of a disease as evil, but the disease itself has no moral existence. Good and evil are human constructs. Which is not to say that a person is not morally responsible for his or her choices and their effects.
One morning in 1995 I had breakfast with Thomas Keneally, author of Schindler’s List. Out of curiosity I asked him what he thought was the difference between Oskar Schindler, rescuer of Jews and hero of his story, and Amon Goeth, the antihero Nazi commandant of the Plaszow concentration camp. His answer was revealing. Not much, he said. Had there been no war, Schindler and Goeth might have been drinking buddies and business partners, morally questionable at times, perhaps, but relatively harmless and ineffectual as historical personages. What a difference a war makes.
This question, on a larger scale, is what spurred the debate in 1996 over the publication of Daniel Goldhagen’s book Hitler’s Willing Executioners. Goldhagen’s thesis is that ordinary Germans participated in the mass murder of Jews, that anti-Semitism was pervasive and nearly exclusively German, and that we cannot blame a handful of extremists in the Nazi party for the Holocaust—all Germans share the blame. “My explanation … is that the perpetrators, ‘ordinary Germans,’ … having consulted their own convictions and morality and having judged the mass annihilation of Jews to be right, did not want to say ‘no.’”8 The problem with this thesis is that it does not explain the exceptions—the Germans who helped Jews and the non-Germans who participated in the Holocaust. Max Frankel, in assessing the Goldhagen thesis, recalls how his Jewish mother escaped Nazi Germany thanks to the help of no less than a Gestapo police chief who, after giving her the name of the Gestapo contact who would aid in her escape, told her, in reference to her goal of reaching America, “If you get there, will you tell them we’re not all bad?”9
It’s true, not all Germans—not even all Nazis—were bad. Likewise, and extrapolating to the larger issue on the table, humans are neither all bad nor all good. Most of us most of the time are good in most situations. Under extreme circumstances, however, the flexibility of our behavior may push us in the other direction. The Holocaust was not the product of “ordinary” Germans, but of Nazi Germans in extraordinary circumstances and conditions. A narrow focus on the proximate causes of the Holocaust misses the ultimate, evolutionary lesson that this tragic event in human history can teach us about the malleable condition of human nature.
Humans evolved to be moral animals, but by no means always moral. There are times when we are amoral, and even immoral. We have the potential for all three, and like any human trait, the degree of expression of the quality varies between individuals. Some people, for whatever reason, are more moral than others. A number of historical contingencies (and who knows what else in his genes and environment) drove Oskar Schindler to follow a completely different path from Amon Goeth, even though he could just as easily have gone the other way. From there the cascading consequences of their decisions took them down their alternately chosen tracks; the road not taken makes all the difference.
This is not mere just-so storytelling. Yale University social psychologist Stanley Milgram observed this range of moral flexibility in his famous “shock” experiments in the 1960s, conducted in the wake of the Adolf Eichmann trial in an attempt to make a scientific study of the banality of evil. How was it possible for educated, intelligent, and cultured human beings to commit mass murder? What environmental conditions would override our evolutionary propensity toward moral behavior and the repulsion most of us would (or at least should) feel in causing or witnessing the pain of another human? Milgram presented his subjects with a “learning” experiment that was purportedly to test the possible effect of punishment on memory. The subject would read a list of paired words to the “learner,” then present the first word of each pair again, upon which the learner had to recall the second word. If the learner was wrong, the teacher—the real subject in this experiment—was to deliver an electric shock. No one was really shocked, of course (the learners were shills working for Milgram who purposely gave wrong answers), but the subjects believed the shocks were real. Sitting in front of the subject was a panel of toggle switches that read: Slight Shock, Moderate Shock, Strong Shock, Very Strong Shock, Intense Shock, Extreme Intensity Shock, DANGER: Severe Shock, XXXX (this final category was labeled at 450 volts). The results were, well, shocking: 65 percent administered the strongest shock possible—XXXX—and 100 percent administered at least a Strong Shock of 135 volts (figure 8).
Figure 8. A Shocking Experiment on Obedience to Authority
In a quest to understand how highly educated and richly cultured Germans could be turned into Nazi mass murderers, psychologist Stanley Milgram undertook the study of obedience to authority. Here subjects, playing the role of “teachers,” were told by a scientist/authority figure to shock “learners” who made mistakes on a test. Sitting in front of the subject was a panel of toggle switches that read: Slight Shock, Moderate Shock, Strong Shock, Very Strong Shock, Intense Shock, Extreme Intensity Shock, DANGER: Severe Shock, XXXX (450 volts). Milgram discovered that 65 percent administered the strongest shock possible—XXXX—and 100 percent administered at least a Strong Shock of 135 volts. (From Stanley Milgram, Obedience, 1965. Courtesy of Penn State Media Sales)
Milgram varied the conditions in order to control for other variables that might influence the outcome of the experiment. The physical proximity of the victim, for example, had an effect on how far the teachers would go in shocking the learners—closer, less shock; farther away, more shock. Group pressure was also a factor—when two other “teachers” (Milgram’s confederates) encouraged the subject to continue shocking the learner, the subject did so with impunity; but when the confederates refused to shock their own learners, the subject tended to refuse as well. In other words, Milgram discovered that moral behavior is extremely malleable. He expressed his own and others’ amazement at what these experiments revealed about our moral natures: “What is surprising is how far ordinary individuals will go in complying with the experimenter’s instructions. Despite the fact that many subjects experience stress, despite the fact that many protest to the experimenter, a substantial proportion continue to the last shock on the generator.” Why? Because, Milgram continued, “It is psychologically easy to ignore responsibility when one is only an intermediate link in a chain of evil action but is far from the final consequences of the action.” Even Eichmann, he noted, was repulsed when he first witnessed a camp killing, but to implement the Holocaust he had only to push papers across a desk. The SS guards who actually did the deed justified their behavior by claiming they were only following orders. “I am forever astonished when lecturing on the obedience experiments in colleges across the country,” Milgram concluded. “I faced young men who were aghast at the behavior of experimental subjects and proclaimed they would never behave in such a way, but who in a matter of months, were brought into the military and performed without compunction actions that made shocking the victim seem pallid. In this respect, they are no better and no worse than human beings of any other era who lend themselves to the purposes of authority and become instruments in its destructive processes.”10
Depending on the circumstances, perhaps any of us could become Nazis. Who is to say otherwise? Raised in a free, democratic society like America, how do any of us know how we might react in a totalitarian regime like Nazi Germany, in a time of war and under pressure to obey one’s superior (what Milgram termed “obedience to authority”)?
Stanford University social psychologist Philip Zimbardo tested this hypothesis in the “Stanford County Jail,” a mock prison set up in the basement of his psychology building where randomly chosen students were assigned to be “prisoners” or “guards.” To make it as real as possible, Zimbardo gave the guards sunglasses, a whistle, a club, and cell keys. Prisoners were arrested, sprayed for lice, forced to stand naked during orientation, given bland uniforms, and stuck in dreary six-by-nine-foot cells. What unfolded in a matter of days was disturbing. These psychologically normal undergraduate American students were transformed into the role of either violent, authoritarian guards or demoralized, impassive prisoners. The experiment was to last for two weeks. Zimbardo was forced to call it off after six days, not just because he feared that the lives of his students would be permanently transformed by this Lord of the Flies experience, but because of what he discovered about the reality of the human moral condition: “I called off the experiment not because of the horror I saw out there in the prison yard, but because of the horror of realizing that I could have easily traded places with the most brutal guard or become the weakest prisoner full of hatred at being so powerless that I could not eat, sleep, or go to the toilet without permission of authorities.”11
How realistic are these experiments conducted in American universities in the 1960s? The real-world example of Nazi Germany, in some sense, serves as a historical experiment. At the Nuremberg Trials following the war, psychologist G. M. Gilbert was assigned to study the men imprisoned for committing these “crimes against humanity.” He discovered that not only were the perpetrators well-cultured and highly educated, they tested out two to three standard deviations above average on a standardized intelligence test used at the time, the Wechsler Adult Intelligence Scale. Where the average IQ is 100, Reichs-Commissioner Seyss-Inquart tested at 141, Reichsmarschall Hermann Göring at 138, Reich Chancellor Franz von Papen at 134, Poland Governor-General Dr. Hans Frank at 130, Foreign Minister Joachim von Ribbentrop at 129, Hitler’s architect and Reichsminister of Armaments Albert Speer at 128. The prison psychiatrist Douglas Kelley, after his evaluations, offered this observation about the moral character of the leading Nazis:
As far as the leaders go, the Hitlers and the Görings, the Goebbels and all the rest of them were not special types. Their personality patterns indicate that, while they are not socially desirable individuals, their like could very easily be found in America. Neurotic individuals like Adolf Hitler, suffering from hysterical disorders and obsessive complaints, can be found in any psychiatric clinic. And there are countless hundreds of similar ones, thwarted, discouraged, determined to do great deeds, roaming the streets of any American city at this very moment. No, the Nazi leaders were not spectacular types, not personalities such as appear only once in a century. They simply had three quite unremarkable characteristics in common—and the opportunity to seize power. These three characteristics were: overweening ambition, low ethical standards, and a strongly developed nationalism which justified anything done in the name of Germandom.12
Hitler aside, arguably the most morally corrupt Nazi for the duration of the twelve-year Reich was SS chief Heinrich Himmler, as responsible as anyone for the horrors of the concentration and extermination camps and for the organization and implementation of the Final Solution. Yet even in the case of Himmler we can see how he might have gone down a different moral path, under dissimilar conditions. The renowned Holocaust historian and Himmler biographer Richard Breitman concluded that “The mass murders, the brutality, the sadism— those were not what was unique about the Nazis. The brutal murder of whole populations, including children, has been with us since the beginning of recorded history and most probably before that.” In fact, Breitman believes, if we cannot explain Himmler then we cannot explain most of human history. “We can put ourselves in the shoes of the perpetrators, as well as the shoes of the victims, because we all have in ourselves the potential for extreme good and extreme evil—at least, what we call good and evil. The real horror of Himmler is not that he was unusual or unique but that he was in many ways quite ordinary, and that he could have lived out his life as a chicken farmer, a good neighbor with perhaps somewhat antiquated ideas about people.”13
From an evolutionary perspective this makes sense. Individuals in our ancestral environment needed to be both cooperative and competitive, depending on the context and desired outcomes. Cooperation may lead to more successful hunts, food sharing, and group protection from predators and enemies. Competition may lead to more resources for oneself and family, and protection from other competitive individuals who are less inclined to cooperate, especially those from other groups. Social psychologists have adequately demonstrated that moral behavior is tractable and that there is a range of potential for the expression of moral or immoral behavior.14 We evolved to be moral, but have the capacity to be immoral some of the time in some circumstances with some people. Which direction any one of us takes in any given situation will depend on a complex array of variables.
An asymmetry in our moral observations about what people are really like comes from the fact that we have a tendency to focus on extreme acts of immorality and ignore the fact that most of the time, most people are gracious, considerate, and benevolent. For every act of violence or deception that appears on the nightly news, there are 10,000 acts of kindness that go largely unnoticed. In fact, violence and deception make the news precisely because they are so unusual in our daily experience. In the U.S. population of 280 million people, acts of cruelty happen daily—the law of large numbers says, in fact, that million-to-one odds happen 280 times a day. But how many of us have seen even one murder, carjacking, or kidnapping? Far more of us have probably been victims of some sort of scam or another, but this is because truth telling is the basis of almost all human interactions. The con artist can only be successful because most of us are honest most of the time. And once burned, twice shy. We learn from our mistakes. Shading and nuance, as we have seen in fuzzy logic, are at the foundation of the study of human behavior.
In the 1961 Israeli trial of Adolf Eichmann, one of the chief orchestrators of the Nazi “Final Solution to the Jewish Question,” Hannah Arendt, covering the trial for the New Yorker, penned a phrase that has become famous in the lexicon of social commentary: “the banality of evil.” Expecting to see the raw viciousness of evil in the face of Eichmann—seated in a bulletproof glass box like a caged predatory beast—Arendt instead gazed upon a sad and pathetic-looking man who recounted in cold language and with dry statistics the collection, transportation, selection, and extermination of millions of human beings. Most surprising of all, Eichmann appeared to be a relatively normal human being—not a monster, not mentally deranged, not so different from many paper-pushing bureaucrats who go about their daily tasks like automata.
Indeed, a glance at Eichmann’s working life shows a person who could share a smoke and a brandy with colleagues after a hard day at the office. Consider Eichmann’s description of the end of the infamous Wannsee Conference, held on January 20, 1942, to plan for the “Endlösung der Judenfrage”:
I remember that at the end of this Wannsee conference, Heydrich, Müller, and my humble self settled down comfortably by the fireplace, and that then for the first time I saw Heydrich smoke a cigar or a cigarette, and I was thinking: today Heydrich is smoking, something I have not seen before. And he drinks cognac—since I had not seen Heydrich take any alcoholic drink in years. After this Wannsee Conference we were sitting together peacefully, and not in order to talk shop, but in order to relax after the long hours of strain.15
What “evil” describes here is the banality of Heydrich and Eichmann’s bureaucratic duties, which included the processing not just of paper but also of people. This is what I call the “evil of banality.”16
Since 1945, in fact, the ultimate test of any moral theory has been Hitler and the Holocaust. If ever there were an embodiment of evil behavior, it is surely Adolf Hitler, and if ever there were an act that should be labeled evil, it is surely the Holocaust. The images in figure 9 of Adolf Hitler alongside his SS henchman Heinrich Himmler, and the photograph of burning bodies in an open pit outside Crematoria 5 at Auschwitz-Birkenau, surely represent nothing if not an evil act.
Yet even here I would go so far as to say that calling Adolf Hitler evil moves us no closer to an understanding of the causes of what he did. What he did may be worse than almost anything anyone ever did to anyone else in history, but it is all still within the realm of human possibilities. The Holocaust may be the supreme act of inhumanity (indeed, the Nuremberg Trials established the legal precedent of convicting individuals for their acts of inhumanity), but we must always keep in mind that these inhuman acts were committed by humans—inhuman acts that lie within our behavioral repertoire. Explaining the Holocaust, in fact, is intimately linked with explaining Hitler, both of which have become something of a scholarly and publishing industry. “The shapes we project onto the inky Rorschach of Hitler’s psyche are often cultural self-portraits in the negative. What we talk about when we talk about Hitler is also who we are and who we are not,” author Ron Rosenbaum writes.17
The explanations for Hitler, and by inference for the Holocaust (as in Milton Himmelfarb’s catchy idiom, “No Hitler, No Holocaust”), have ranged from the ridiculous (Hitler’s grandfather was Jewish) and the absurd (the “one ball” theory that Hitler had only one testicle) to the metaphysical (Hitler was evil). Some insist the explanation has been found (John Lukacs places the crystallization of Hitler’s anti-Semitic personality as early as 1919), that it can be but has not yet been found (Yehuda Bauer: “Hitler is explicable in principle, but that does not mean that he has been explained”), that it cannot be found (Emil Fackenheim: “The closer one gets to explicability the more one realizes nothing can make Hitler explicable”), or that it should not be found (Claude Lanzmann: “There is even a book written … about Hitler’s childhood, an attempt at explanation which is for me obscenity as such”). The Hitler of the Holocaust ranges wildly between intentional and functional evil. Lucy Dawidowicz’s Hitler is a sole conductor who orchestrated the Holocaust with evil intent, deciding on his war against the Jews as early as November 1918 while still in the military hospital recovering from a gas attack. By contrast, Christopher Browning’s Hitler stumbles his way hesitatingly into the Holocaust, with “a sense that in the end he was scared of what he was doing. Now I interpret that as he didn’t think it was wrong, but he was aware that he was now doing something that had never been done before.”18
Figure 9. Hitler, Himmler, and the Holocaust as the Embodiment of Pure Evil
Heinrich Himmler and Adolf Hitler, architects of the Final Solution. (Photograph by Estelle Bechhoefer. Courtesy of U.S. Holocaust Memorial Museum)
A secret photograph of the burning of bodies in an open pit after gassing. When the crematoria were not working, or there were too many bodies for the capacity of the crematoria, the Nazis resorted to burning bodies in large communal ditches. (Courtesy of Yad Vashem, Jerusalem, Israel)
The problem scholars and historians have had in explaining Hitler and the Holocaust is the same one that plagues explanations of evil—that is, the myth of pure evil. As Rosenbaum opines: “The search for Hitler has apprehended not one coherent, consensus image of Hitler but rather many different Hitlers, competing Hitlers, conflicting embodiments of competing visions, Hitlers who might not recognize each other well enough to say ‘Heil’ if they came face to face in Hell.”19 If Hitler can escape explanation in this sense, can the Holocaust? What about evil itself? We agree on the basic facts about the Holocaust, but interpretations about why it happened and what it means quickly become entangled in contradictory premises about human history and human nature. For Claude Lanzmann, the Holocaust “is a product of the whole story of the Western world since the very beginning.”20 But what does this tell us? If everything is the cause, then nothing is the cause.
These are not explanations as such. They are more like opinion editorials. A scientific approach to explaining evil can be found in social psychologist Roy Baumeister’s thoroughly researched treatise on evil. Baumeister demonstrates that although for most people killing one human being is repulsive, killing millions can become routine: “The essential shock of banality is the disproportion between the person and the crime. The mind reels with the enormity of what this person has done, and so the mind expects to reel with the force of the perpetrator’s presence and personality. When it does not, it is surprised.”21 The explanation for the surprise can be found by contrasting the victim’s perspective with that of the perpetrator. For example, Maximilian Grabner, head of the Political Department at Auschwitz and associate of the camp commandant Rudolf Höss, explained the crime of the Holocaust from the perpetrator’s perspective: “I only took part in this crime because there was nothing I could do to change anything. The blame for this crime lay with National Socialism. I myself was never a National Socialist. Nevertheless, I still had to join the party … . I only took part in the murder … out of consideration for my family. I was never an anti-Semite and would still claim today that every person has the right to live.”22 This is the evil of banality in its purest state.
In taking a broader perspective on the perpetrator’s evil, Baumeister targets specific bad acts for scientific examination, including wife beatings, gang violence, drive-by shootings, rape, and other examples of what he calls the “breakdown of self-control.” In an interesting twist on how most of us think about evil and violence, Baumeister suggests that “you do not have to give people reasons to be violent, because they already have plenty of reasons. All you have to do is take away their reasons to restrain themselves.”23 Most of us restrain ourselves most of the time, but there are circumstances under which any of us has the potential to express extreme anger and violence. Are we all, then, evil—or at least potentially evil? No, not quite. We all have the potential to behave in ways that others might consider bad, cruel, mean, or violent. Baumeister makes this point in exploring seven myths about evil:
1. Evil is always intentional (from the victims’ perspective; perpetrators always have a justification).
2. Evil is motivated by pleasure.
3. The victim of evil is innocent and good.
4. Evil is conducted by people completely different from us, wholly other.
5. Evil is the original sin, built into our natures.
6. Evil is the opposite not only of good but of order, peace, and stability.
7. Evil people are selfish egotists driven to improve their self-esteem by evil acts.24
It is important to note that in no way does debunking the myth of pure evil ignore the fact that human behaviors range broadly, or that in some cases a person may have serious mental problems or fall well away from the mean of normal human behavior toward genuinely wicked or sadistic actions. But while some may call Adolf Hitler a madman or a psychopath, I doubt that anyone would allow him to plead “not guilty by reason of insanity” for his crimes against humanity.
A deeper problem caused by the myth of pure evil, says Baumeister, is that it “conceals the reciprocal causality of violence.” That is, as in divorces and most other human interactions involving conflict and resolution, there are two sides to almost every story. It turns out, for example, that research on perpetrators shows that they have, in their minds anyway, perfectly legitimate reasons for the violence. Ironically, Baumeister concludes, the myth of evil itself may lead to greater violence: “The myth encourages people to believe that they are good and will remain good no matter what, even if they perpetrate severe harm on their opponents. Thus, the myth of pure evil confers a kind of moral immunity on people who believe in it … . Belief in the myth is itself one recipe for evil, because it allows people to justify violent and oppressive actions. It allows evil to masquerade as good.”25
September 11, 2001, comes to mind here. United States President George W. Bush described what happened that day as an act of pure evil. Yet millions of people around the world celebrate that day as a triumphant victory over what they perceive to be an evil American culture. What we are witnessing here is not a conceptual difference in understanding the true nature of evil. Nor is it simply a matter of who is in the right. It is, at least on one important level, a difference of perspective. To achieve true understanding and enlightenment it might help to understand what the other side was thinking. In a less emotionally charged example, if you lived in seventeenth-century Europe and you really believed that torturing religious heretics and burning women as witches would save their souls and restore peace to your community, then from that perspective the Spanish Inquisition and the European witch craze were supreme acts of morality. Similarly—and it seems almost blasphemous to suggest it—if you lived in the twenty-first-century Muslim Middle East and you truly believed that killing American citizens was God’s will to save your people and restore peace to your community, then from that perspective what could serve as a more visually striking statement than bringing down the twin symbols of your enemy? If there is a moral module in the brain (and I suspect there is something that at least corresponds to the concept of such a module, even if it is splayed out over a large portion of the cortex or consists of many smaller interconnected modules), then I have little doubt that Osama bin Laden’s and Mohamed Atta’s moral modules were fully lit up on September 11.
This is not to argue that morality is reduced to one’s perspective, or that events like the Holocaust do not represent an act of evil (in its adjectival form). But if we are to understand why the Holocaust happened, we must scientifically investigate the reasons behind such acts.
If we are not going to talk about evil as a metaphysical entity, then how shall we talk about it? One answer is to study evil as a scientist would, beginning with proper descriptive terms and employing fuzzy logic.26 In fuzzy logic, shades of gray rule the universe, despite our heroic efforts over at least the past two and a half thousand years to dichotomize the world into Platonic categories. Aristotle said A is A, and that binary logic dictates a single overarching law: A or not-A. Either something is A or it is not A. It cannot be both. The sky is blue or it is not blue. It cannot be both blue and not blue. But what color is the sky at sunrise and sunset? What color is it overhead versus on the horizon? What color is it at 150,000 feet? And how is blue defined? What shade of blue was chosen as the defining essence of “blueness”? In point of fact, in the real world something can be A and not A. The sky can be blue and not blue. Thinking like a scientist, in statistical terms, we can assign a probability to the blueness of the sky. The sky is fuzzy blue. Directly overhead we might call it .9 blue. At a forty-five-degree angle from directly overhead we might assign it a fuzziness of .7 blue. On the horizon it might be a .5 blue.27
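For readers who like to see the arithmetic, the sky example above can be sketched in a few lines of code. The membership values are, of course, the hypothetical ones from the example, not measurements, and the functions are only an illustration of how fuzzy degrees differ from Aristotle’s A-or-not-A:

```python
# A sketch of the fuzzy-logic idea: instead of the binary question
# "is the sky blue? (A or not-A)", each observation receives a
# degree of blueness between 0.0 and 1.0.

def binary_blue(membership: float) -> bool:
    """Aristotelian two-valued logic: forces every shade into A or not-A."""
    return membership >= 0.5

# Hypothetical fuzzy "blueness" values from the example in the text.
sky = {
    "directly overhead": 0.9,
    "forty-five degrees from overhead": 0.7,
    "on the horizon": 0.5,
}

for position, blueness in sky.items():
    print(f"{position}: fuzzy blue = {blueness}, "
          f"binary logic says blue = {binary_blue(blueness)}")

# Standard fuzzy operators (min for AND, 1 - x for NOT) let the sky
# be partly blue and partly not-blue at the same time:
overhead = sky["directly overhead"]
print("blue AND not-blue overhead:", min(overhead, 1 - overhead))  # nonzero
```

The binary function must throw away information at an arbitrary threshold; the fuzzy values keep the shades of gray that the rest of this chapter applies to moral behavior.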
Fuzzy logic also allows for subtle nuances and shades of gray in the real-life complexities of ethical dilemmas.28 The social psychologist Carol Tavris provides an apt example of fuzzy thinking in the moral realm in a 1998 Los Angeles Times op-ed on the Clinton morality debacle, with a title question that answers itself—“All Bad or All Good? Neither.” Tavris explains that the scientific evidence points to the fact that “people’s reasoning about moral dilemmas, like their moral behavior itself, is specific to given situations.”29 The false choice of either all bad or all good does not depict the subtleties and nuances of human behavior.
Consider Lawrence Kohlberg’s famous stage theory of moral reasoning, for example, which suggests that, as people grow up, they pass through largely fixed rungs on a moral ladder:
1. Obedience and Punishment
2. Individualism, Instrumentalism, and Exchange
3. “Good Boy/Good Girl” Conformity
4. Law and Order
5. Social Contract
6. Principled Conscience30
Morality, says Kohlberg, develops within an individual, beginning with parental fear of punishment, moving to selfish hedonism, changing to conformity and loyalty to peers, developing into social law and order, rising to social contract reasoning, and finally reaching the highest rung of Gandhiesque moral principles. Research subsequent to Kohlberg’s shows, however, that these stages are not as fixed and universal as they once appeared. One anthropologist, for example, found that these stages do not always apply to people in non-Western cultures. Moral development varies widely across the globe.31 Two extensive studies on Kohlberg’s theory by psychologists of religion found that specific cultural and religious values held by an individual influence where he or she may fall in the stage sequence.32 In other words, the stages are not fixed developmental sequences so much as they are values that are context dependent. One study discovered that people make a distinction between moral and religious values in the various stages (since not all values in religion are related to morality), and another study reported that when the individual’s religion emphasized principle-based moral decisions, they were more inclined to reach the “highest” rung of the moral ladder.33 One psychologist even found a slight negative correlation between Kohlberg’s stages and both intrinsic and extrinsic religiosity, meaning that whether one is motivated to be moral by intrinsic principles or by extrinsic rewards of one’s religion depends on the context and circumstances of the moral issue; the stages of moral development were irrelevant.34 And social psychologist Carol Gilligan noted that women’s moral development differs from that of men. Women tend to emphasize care and responsibility in moral choices, whereas Kohlberg emphasized male “justice orientation” in his research.35
Humans can be morally principled in one circumstance and hedonistic in another; we may fear punishment in one context and exert loyalty to friends in a different one. Referencing Clinton but generalizing to us all, Carol Tavris concludes that “the assumption that a moral failing in one domain reveals something profoundly important about a person’s entire character, or predicts his or her behavior in other situations, is wrong.”36 One bad act does not an immoral person make. Perhaps this is what Jesus meant when he defied the Old Testament law that required the death penalty for adultery, by challenging a woman’s accusers: “He that is without sin among you, let him first cast a stone at her.” This may also represent the fundamental difference between Old Testament and New Testament morality: inflexible moral principles versus contextual moral guidelines—a stricter, draconian God versus a kinder, gentler God. As we shall see in the second half of this book, a fuzzy provisional moral system is another step in the contextualization of moral rules and behavior, where moral principles are ranked in terms of their fuzzy values, which can change under changing circumstances yet still retain their core meaning. A case study in how fuzzy logic can be applied to the study of the origins of our moral and immoral nature can be found in the Yanomamö peoples of Amazonia, variously described by their ethnographers as either erotic or fierce.
On July 7, 1947, a spacecraft crash-landed near Roswell, New Mexico. Aboard was an alien anthropologist sent here from an advanced civilization to study the newly emerging intelligence that calls itself “wise man.” Whether or not that self-assessment is accurate was of great interest to the Galactic Federation, for this species had just achieved mastery of atomic fission, and thus could be a potential threat to the galactic peace. The Galactic Anthropological Association wanted to know if this formerly primitive people was basically “erotic” and thus there would be no need to worry about them, or if they were inherently “fierce,” in which case further monitoring and missionary reeducation might be required.
From the subject’s (our) perspective the fundamental flaw in the inquiry is that humans are not so easily pigeonholed into such clear-cut categories as “fierce” or “erotic.” We are both (and a lot more), the nature and intensity of our behavior depending upon a host of biological, social, and historical variables. If an alien anthropologist had crashed in Europe in 1943, our intrepid observer would surely have called us the “fierce” people. But if, say, the landing site was Woodstock, New York, or San Francisco, California, in 1968, ET would likely have labeled us the “erotic” people. Local and historical context matters, and any description based on an isolated context is grossly oversimplified and hopelessly incomplete. This is why we need anthropologists, the scientific observers of our planet’s rich diversity of peoples and cultures.
There is a maxim anthropologists often cite about the geopolitics of diplomacy and warfare among indigenous peoples: The enemy of my enemy is my friend. In reality, of course, the maxim applies to virtually all groups, from bands to tribes to chiefdoms to states—recall the temporary friendship between the United States and the USSR from 1941 to 1945 that promptly dissolved into the cold war upon Germany’s defeat. I thought of this maxim when I interviewed journalist Patrick Tierney about his book Darkness in El Dorado: How Scientists and Journalists Devastated the Amazon. Tierney had recently been pummeled by a panel of experts in front of a thousand scientists gathered at the annual meeting of the American Anthropological Association. At the heart of Darkness in El Dorado is the question of whether humans are by nature erotic, fierce, or both. Among the many scientists whom Tierney attacks is the anthropologist Napoleon Chagnon, whose study of the Yanomamö people of Amazonia is arguably the most famous ethnography since Margaret Mead’s Samoan classic, Coming of Age in Samoa. Since Chagnon has a reputation as an intellectual pugilist who has accumulated a score of enemies over the decades, I fully expected that the scientists would rally around Tierney in a provisional alliance. With a couple of minor exceptions, however, there was almost universal condemnation of the book. A British science writer who witnessed the verbal thumping said: “If I had taken such a beating as Tierney I would have crawled out of the room and cut my throat.”37
Humans are storytelling animals. Thus, following Darwin’s Dictum that “all observation must be for or against some view if it is to be of any service,” we can recognize that Tierney’s story argues against the view that he believes has been put forth by certain anthropologists about the Yanomamö and, by implication, about all humanity. Chagnon, he points out, subtitled his best-selling ethnographic monograph on the Yanomamö The Fierce People. Tierney spares no ink in presenting a picture of Chagnon as a fierce anthropologist who sees in the Yanomamö nothing more than a reflection of himself (figure 10). Chagnon’s sociobiological theory that the most violent and aggressive males win the most copulations and thus pass on their genes for “fierceness,” says Tierney, is a Rorschachian window into Chagnon’s own libidinous impulses.
Figure 10. Napoleon Chagnon: The Man Who Called the Yanomamö “Fierce”
Anthropologist Napoleon Chagnon accompanies two Yanomamö men on a 1995 field study. (Courtesy of Napoleon Chagnon)
Tierney’s strongest case may be against the French anthropologist Jacques Lizot, who calls the Yanomamö “the erotic people.”38 Lizot, Tierney claims, engaged in homosexual activities for years with so many Yanomamö young men, and so frequently, that he became known in Yanomamö-speak as bosinawarewa, which translates politely as “ass handler” and not so politely as “anus devourer.”39 In response to these claims not only did Lizot not deny the basic charges (which also included exchanging goods for sex), but he admitted to Time magazine: “I am a homosexual, but my house is not a brothel. I gave gifts because it is part of the Yanomamö culture. I was single. Is it forbidden to have sexual relations with consenting adults?”40 No, but Tierney disputes both the age of Lizot’s partners and whether or not they consented, and suggests that even if it were both legal and moral this is hardly the standard of objectivity one might have hoped for in scientific research, and that it is Lizot who best deserves the descriptive adjective “erotic.”
So which is it? Are the Yanomamö fierce or erotic, or are these descriptive terms better suited to their anthropological observers? Carping over minutiae in Chagnon’s research methods and ethics has dogged him throughout his career, but it is secondary to a deeper, underlying issue in the anthropology wars. What Chagnon is really being accused of is biological determinism. To postmodernists and cultural determinists, calling the Yanomamö “fierce” and explaining their fierceness through a Darwinian model of competition and sexual selection indicts all of humanity as innately evil and condemns us to a future of ineradicable violence, rape, and war. Are we really this bad? Are the Yanomamö?41
The Yanomamö skirmish is only the latest in a long line of battles that have erupted in the century-long anthropology wars. The reason such controversies draw so much public attention is that what’s at stake is nothing less than the true nature of human nature and how that nature can most profitably be studied.
Anthropologist Derek Freeman’s lifelong battle with the legacy of Margaret Mead, for example, was not really about whether Samoan girls are promiscuous or prudish. Mead’s philosophy, inherited from her mentor, Franz Boas, that human nature is primarily shaped by the environment, was supported by her “discovery” that Samoan girls are promiscuous, whereas in other cultures promiscuity is taboo. Freeman argues that Mead was duped by a couple of Samoan hoaxers, and had she been more rigorous and quantitative in her research she would have discovered this fact before going to press with what became the all-time anthropological best-seller—Coming of Age in Samoa. According to Freeman, Mead’s ideology trumped her science, and anthropology lost.42 His 1983 book, Margaret Mead and Samoa: The Making and Unmaking of an Anthropological Myth, triggered a paroxysm within the anthropological community, as he recalled:
The 1983 annual meeting of the American Anthropological Association in Chicago included a special session dedicated to my book, but strangely I was not invited to defend myself. Now I know why. One eyewitness described it as “a sort of grotesque feeding frenzy,” while another told me “I felt I was in a room with people ready to lynch you.” At the annual business meeting later that day, a motion denouncing my refutation as “unscientific” was moved, put to the vote, and passed. In the December, 1983 issue of the American Anthropologist, no fewer than five different critiques of my book were published, but I was denied the usual right of simultaneous reply. My rejoinder, when it did appear, some six months later, was limited to one tenth of the space that had been given to my critics.43
Anthropology is a sublime science because it deals with such profoundly deep questions as the nature of human nature. But to even ask such questions as “Are we by nature good or evil?” overlooks the complexity of human affairs. The failure of Darkness in El Dorado has less to do with getting the story straight and more to do with a fundamental misunderstanding of the plasticity and diversity of human behavior. Tellingly, the fourth edition of Chagnon’s classic work Yanomamö dropped the subtitle The Fierce People. Had Chagnon determined that the Yanomamö were not “the fierce people” after all? No. He realized that too many people were unable to move past the moniker to grasp the complex and subtle variations contained in all human populations, and he became concerned that they “might get the impression that being ‘fierce’ is incompatible with having other sentiments or personal characteristics like compassion, fairness, valor, etc.”44 In fact, the Yanomamö call themselves waiteri (fierce), and in calling them such Chagnon was merely attempting “to represent valor, honor, and independence” that the Yanomamö saw in themselves. As he notes in his opening chapter, the Yanomamö “are simultaneously peacemakers and valiant warriors.” Like all people, the Yanomamö have a deep repertoire of responses for varying social interactions and differing contexts, even those that are potentially violent: “They have a series of graded forms of violence that ranges from chest-pounding and club-fighting duels to out-and-out shooting to kill. This gives them a good deal of flexibility in settling disputes without immediate resort to lethal violence.”45
Chagnon has often been accused of using the Yanomamö to support a sociobiological model of an aggressive human nature. Even here, returning to the primary sources in question shows that Chagnon’s deductions from the data are not so crude, as when he notes that the Yanomamö’s northern neighbors, the Ye’Kwana Indians—in contrast to the Yanomamö’s initial reaction to him—“were very pleasant and charming, all of them anxious to help me and honor bound to show any visitor the numerous courtesies of their system of etiquette,” and therefore that it “remains true that there are enormous differences between whole peoples.”46 Even on the final page of his chapter on Yanomamö warfare, Chagnon inquires about “the likelihood that people, throughout history, have based their political relationships with other groups on predatory versus religious or altruistic strategies and the cost-benefit dimensions of what the response should be if they do one or the other.” He concludes: “We have the evolved capacity to adopt either strategy.”47
As an example of this moral plasticity, Chagnon summarized the data from his now-famous Science article revealing the positive correlation between levels of violence among Yanomamö men and their corresponding number of wives and offspring. “Here are the ‘Satanic Verses’ that I committed in anthropology,” Chagnon joked, as he reviewed his data:
I didn’t intend for this correlation to pop out, but when I discovered it, it did not surprise me. If you take men who are in the same age category and divide them by those who have killed other men (unokais) and those who have not killed other men (non-unokais), in every age category unokais had more offspring. In fact, unokais averaged 4.91 children versus 1.59 for non-unokais. The reason is clear in the data on the number of wives: unokais averaged 1.63 wives versus 0.63 for non-unokais. This was an unacceptable finding for those who hold the ideal view of the Noble Savage. ‘Here’s Chagnon saying that war has something good in it.’ I never said any such thing. I merely pointed out that in the Yanomamö society, just like in our own and other societies, people who are successful and good warriors, who defend the folks back home, are showered with praise and rewards. In our own culture, for example, draft dodgers are considered a shame. Being a successful warrior has social rewards in all cultures. The Yanomamö warriors do not get medals and media. They get more wives.48
Despite the mountains of data Chagnon has accumulated on Yanomamö aggression, he is careful to note the many other behaviors and emotions expressed by the Yanomamö: “When I called the Yanomamö the ‘fierce people,’ I did not mean they were fierce all the time. Their family life is very tranquil. Even though they have high mortality rates due to violence and aggression and competition is very high, they are not sweating fiercely, eating fiercely, belching fiercely, etc. They do kiss their kids and are quite pleasant people.”49
In contrast to Chagnon’s depiction of the Yanomamö as “fierce,” many commentators (such as Tierney) hold up anthropologist Ken Good’s book, Into the Heart: One Man’s Pursuit of Love and Knowledge Among the Yanomami, to argue the case that they are “erotic.” Into the Heart is a page-turner because the very features of Yanomamö culture that Chagnon’s critics claim he overemphasizes are, in fact, present in spades in every chapter of Good’s gripping tale. As Chagnon’s graduate student, Good immersed himself in Yanomamöland, but in time found himself falling in love with a beautiful young Yanomamö girl named Yarima.50 As the years passed and he was occasionally forced to leave Yanomamöland (to renew his permit or attend conferences or work on his doctoral dissertation), he became emotionally distraught over leaving Yarima alone. Why? When Yarima came of age (defined in her culture as first menses), she and Good began living together and consummated their “marriage” (Yanomamö do not have a marriage ceremony per se; instead one member of a couple, usually the man, declares that they are married and the two begin living together). Good’s problem was that he was all too aware of the very human nature of Yanomamö men. “They will grab a woman while she is out gathering and rape her. They don’t consider it a crime or a horrendously antisocial thing to do. It is simply what happens. It’s standard behavior. In such a small, enclosed community this (together with affairs) is the only way unmarried men have of getting sex.”51
Good’s worries were justified and the universal emotion of jealousy was no less intense in this highly civilized, educated man than it was in any of the people he was studying to earn his Ph.D. In short, Good was on an emotional roller coaster from which he could not extricate himself.
I felt the tension, and I tried to deal with it. I wanted to think that Yarima would be faithful to me. But I knew the limits of any woman’s faithfulness here. Fidelity in Yanomami land is not considered a standard of any sort, let alone a moral principle. Here it is every man for himself. Stealing, rape, even killing—these acts aren’t measured by some moral standard. They aren’t thought of in terms of proper or improper social behavior. Here everyone does what he can and everyone defends his own rights. A man gets up and screams and berates someone for stealing plantains from his section of the garden, then he’ll go and do exactly the same thing. I protect myself, you protect yourself. You try something and I catch you, I’ll stop you.52
Many antisocial behaviors, such as theft, are kept at a minimum through such social constraints as shunning or such personal constraints as fear of violent retaliation. But, as Good explains, sex is a different story because “The sex drive demands an outlet, especially with the young men. It cannot be stopped. Thus the personal and social constraints have less force; they’re more readily disregarded.” As a consequence women are often raped, an act they themselves must keep secret for fear of retaliation from their husbands against them. If the wife is young and childless, “the husband might find he cannot tolerate it; he might lose control utterly and embark on violent action. He badly wants to at least get his family started himself, rather than have someone else make her pregnant.”53
Chagnon’s ethnography of the Yanomamö people is a case study in the application of fuzzy logic to human nature. In Yanomamö Chagnon notes that the variation in violence observed by different scientists can be accounted for by a concatenation of intervening variables, such as geography, ecology, population size, resources, and especially the contingent history of each group, where “the lesson is that past events and history must be understood to comprehend the current observable patterns. As the Roman poet Lucretius mused, nothing yet from nothing ever came.”54
Many other anthropologists who have studied the Yanomamö corroborate Chagnon’s data and interpretations. Even at their “fiercest,” however, the Yanomamö are not so different from many other peoples around the globe (recall Captain Bligh’s numerous violent encounters with Polynesians and Captain Cook’s murder at the hands of Hawaiian natives), even when studied by tender-minded, nonfierce scientists. Evolutionary biologist Jared Diamond, for example, told me that he found the role of warfare among the peoples of New Guinea that he has studied over the past thirty years quite similar to Chagnon’s depiction of the role of warfare among the Yanomamö.55
Finally, if the last five thousand years of recorded human history is any measure of a species’ “fierceness,” the Yanomamö have got nothing on either Western or Eastern “civilization,” whose record includes the murder of hundreds of millions of people. Homo sapiens in general, like the Yanomamö in particular, are the erotic-fierce people, making love and war far too frequently for our own good as both overpopulation and war threaten our very existence.
Despite Patrick Tierney’s claim that Chagnon has exaggerated the level of aggression and rape among the Yanomamö, Kenneth Good documented both, here showing “two men duel over the infidelity of one of their wives.” (Courtesy of Kenneth Good) (bottom) Many Yanomamö men have deep scars on their heads from such battles. (Courtesy of Napoleon Chagnon)
In 1670, the British poet John Dryden penned this expression of humans in a state of nature: “I am as free as Nature first made man / When wild in woods the noble savage ran.” A century later, in 1755, the French philosopher Jean-Jacques Rousseau canonized the noble savage into Western culture by proclaiming, “nothing can be more gentle than him in his primitive state, when placed by nature at an equal distance from the stupidity of brutes and the pernicious good sense of civilized man.” From the Disneyfication of Pocahontas to Kevin Costner’s ecopacifist Native Americans in Dances with Wolves, and from postmodern accusations of corrupting modernity to modern anthropological theories that indigenous people’s wars are just ritualized games, the noble savage remains one of the last epic creation myths of our time. Within this myth lies the antithesis of the myth of pure evil, and that is the myth of pure good. The latter is just as detrimental toward a deeper understanding of human moral nature as is the former. The evidence from all the human sciences overwhelmingly supports the view that humans are good and bad, cooperative and competitive, selfish and altruistic. The potential for the expression of both moral and immoral behavior is built into human nature. How, when, and where such behaviors are expressed depends on a host of variables. But the myth of the noble savage extends far beyond what Rousseau envisioned and is still embraced today by many scientists, academics, and social commentators in what I call the Beautiful People Myth (BPM). The BPM is the fable of pacifist and ecofriendly humans ruined only by the plight of modernity and the burden of Dead White European Males. I have characterized the myth in the following manner:
Long, long ago, in a century far, far away, there lived beautiful people coexisting with nature in balanced ecoharmony, taking only what they needed, and giving back to Mother Earth what was left. Women and men lived in egalitarian accord and there were no wars and few conflicts. The people were happy, living long and prosperous lives. The men were handsome and muscular, well coordinated in their hunting expeditions as they successfully brought home the main meals for the family. The tanned, bare-breasted women carried a child in one arm and picked nuts and berries to supplement the hunt. Children frolicked in the nearby stream, dreaming of the day when they too would grow up to fulfill their destiny as beautiful people.
But then came the evil empire—European White Males carrying the diseases of imperialism, industrialism, capitalism, scientism, and the other “isms” brought about by human greed, carelessness, and short-term thinking. The environment became exploited, the rivers soiled, the air polluted, and the beautiful people were driven from their land, forced to become slaves, or simply killed.
This tragedy, however, can be reversed if we just go back to living off the land where people would grow just enough food for themselves and use only enough to survive. We would then all love one another, as well as our caretaker Mother Earth, just as they did long, long ago, in a century far, far away.
Into the Heart is a moving love story between anthropologist Kenneth Good and a young Yanomamö woman named Yarima. They eventually married, had children, and returned to the United States. Yarima grew bored with American life and returned to the more stimulating environment of Amazonia. (Courtesy of Kenneth Good)
I have thoroughly deconstructed and debunked the Beautiful People Myth elsewhere, so I will not belabor the point here.56 When it comes to how humans treat other humans and the environment, the Beautiful People have never existed except in myth. Humans are neither Beautiful People nor Ugly People, in the same way that we are neither moral nor immoral in some absolute categorical sense. Humans are only doing what any species does to survive; but we do it with a twist (and a vengeance)—instead of our environment shaping us through natural selection, we are shaping our environment through artificial selection. In a fascinating 1996 study, for example, University of Michigan ecologist Bobbi Low used the data from the Standard Cross-Cultural Sample to test the hypothesis that we can solve our ecological problems by returning to the mythological Beautiful People’s attitudes of reverence for (rather than exploitation of) the natural world, and by opting for long-term group-oriented values (rather than short-term individual values).57 Her analysis of 186 hunting-fishing-gathering (HFG) societies around the world showed that their use of the environment is driven by ecological constraints and not by attitudes such as sacred prohibitions, and that their relatively low environmental impact is the result of low population density, inefficient technology, and the lack of profitable markets, not of conscious efforts at conservation. Low also showed that in 32 percent of HFG societies, not only was conservation not practiced, but environmental degradation was severe; again, it was limited only by the time and technology available to finish the job of destruction and extinction.
Extending the analysis of the BPM to other areas of human culture, UCLA anthropologist Robert Edgerton surveyed the anthropological record and found clear evidence of drug addiction, abuse of women and children, bodily mutilation, economic exploitation of the group by political leaders, suicide, and mental illness in indigenous preindustrial peoples, groups not contaminated by Western values (allegedly the source of such “sick” behavior).58
Anthropologist Shepard Krech analyzed a number of Native American communities, such as the Hohokam of southern Arizona, and discovered that a large-scale irrigation program led to the salinization and exhaustion of the Gila and Salt River valleys, ultimately triggering the collapse of their society. Krech says that even the reverence for big game animals we have been led to believe was ingrained in the world-view of America’s indigenous peoples is a myth. Many, if not most, Native Americans believed that common game animals such as elk, deer, caribou, beaver, and especially buffalo were replenished through divine physical reincarnation. Game populations bounced back after successful hunts not because Native Americans made it happen through ecological veneration, but because they believed the gods willed it.59 Given the opportunity to overhunt big game animals, Native Americans were only too willing to do so.
One of the most poignant examples of this is the famous “Head-Smashed-In” buffalo kill site in southern Alberta, Canada. I had an opportunity to visit Head-Smashed-In (the name alone belies the Noble Indian myth). It is a most dramatic site. Standing on the edge of the cliff, one looks down upon a thirty-foot-thick deposit of buffalo bones that reflects five thousand years of Native American mass hunting. Looking back away from the cliff, one sees a vast and expansive V-shaped valley in which the hunters ambushed and drove their game for tens of miles. The terrain is on a slight decline toward the cliff, so these massive animals built up so much speed that upon reaching the cliff they were unable to stop themselves. They tumbled over, one after another, until there were so many carcasses that most were left unused. Buffalo populations were ultimately stable not because of a Native American conservation ethic, but because they simply did not have the numbers and technology to drive these big game animals into extinction. Other species were not so fortunate. The evidence is now overwhelming that woolly mammoths, giant mastodons, ground sloths, one-ton armadillo-like glyptodonts, bear-sized beavers, and beefy saber-toothed cats, not to mention American lions, cheetahs, camels, horses, and many other large mammals, all went extinct at the same time that Native Americans first populated the continent in the mass migration from Asia some 15,000 to 20,000 years ago. The best theory to date as to what happened to these mammals is that they were overhunted into extinction.60
The evidence is overwhelming that violence, aggression, and warfare are part of the behavioral repertoire of most primate species. While most conflicts among monkeys end relatively peacefully, this is due primarily to the fact that they lack brute strength and deadly weapons. In their stead, screams, gestures, pushing, hitting, and biting result in struggles for and changes in social status and mate choices, but it is clear that the potential for deadly violence exists. Among the great apes it was long believed that “only man kills,” but that is no longer the case. Murderous raids among chimpanzees have now been well documented, and they are not rare. While it would be inappropriate to compare the gang raids among chimpanzees to the wars of modern civilization (chimpanzee gangs mostly attack individuals or much smaller groups), the basic process of a gang of young and aggressive males fanning out into neighboring environments on a seek-and-destroy mission to gain resources and females is apparent in the species, genus, and family. As Jane Goodall famously observed: “If they [chimpanzees] had had firearms and had been taught to use them, I suspect they would have used them to kill.”61
Even when anthropologists have admitted that there is evidence for prehistoric human warfare, they often portray it as rare, harmless, and little more than ritualized sport. Now even that noble image has taken a major hit from new data. For example, in his survey of and comparison between primitive and civilized societies, University of Illinois anthropologist Lawrence Keeley demonstrates that prehistoric war was, relative to population densities and fighting technologies, at least as frequent (as measured in years at war versus years at peace), as deadly (as measured by percentage of conflict deaths per population), and as ruthless (as measured by the killing and maiming of noncombatant women and children) as modern war.62 At a bone bed site at Crow Creek in South Dakota, dated in the pre-Columbian fourteenth century, Keeley also found “the remains of nearly 500 men, women, and children. These victims had been scalped, mutilated, and left exposed for a few months to scavengers before being interred.” Keeley also recounts the last moments of a young man who was shot in the back during a mass raid of a Neolithic village in Britain, during which he fell and crushed the infant he was carrying. In yet another example, seven thousand years ago in Talheim, Germany, a band of thirty-four adults and children were murdered by blows to the head and then tossed into an open pit, not so different from what the Nazis did many millennia later.63 The Nazis had no monopoly on violence.
In Constant Battles, an exceptionally insightful study of this problem by Steven A. LeBlanc, the Harvard archaeologist quips, “anthropologists have searched for peaceful societies much like Diogenes looked for an honest man.” That is, they are exceptionally rare. “In spite of the presumption that most societies were peaceful in the past, anthropologists have had a lot of trouble finding ethnographically known peaceful people. Despite all the effort that has been devoted to the search, the number of what can be considered classic cases of peaceful societies is quite small.”64 Consider the evidence from a 10,000-year-old Paleolithic site along the Nile River: “The graveyard held the remains of fifty-nine people, at least twenty-four of whom showed direct evidence of violent death, including stone points from arrows or spears within the body cavity, and many contained several points. There were six multiple burials, and almost all those individuals had points in them, indicating that the people in each mass grave were killed in a single event and then buried together.”65 LeBlanc presents evidence from a site in Utah that contains the remains of ninety-seven people killed violently: “six had stone spear heads in them … several breast bones shot through with arrows and many broken heads and arms … . Individuals of all ages and both sexes were killed, and individuals were shot with atlatl darts, stabbed, and bludgeoned, suggesting that fighting was at close quarters.”66
LeBlanc’s survey of our not-so-noble past reveals that even cannibalism, long thought to be a form of primitive urban legend and a myth to be debunked (noble savages would never eat each other, would they?), has now been supported by powerful physical evidence that includes broken and burned bones, bones with cut marks, bones broken open lengthwise to allow access to the marrow, and bones broken to fit inside cooking jars. Such evidence for prehistoric cannibalism has been uncovered in Mexico, Fiji, Spain, and other parts of Europe. The final (and gruesome) proof came with the discovery of the human muscle protein myoglobin in the fossilized human feces of a prehistoric Anasazi pueblo Indian.67
Savage yes. Noble no.
We are moral animals, yes, but we are also immoral animals, tragically but indubitably so. Figures 14 through 17 graphically depict this hard and factual reality of the human condition.
Figure 14. Political Organization and Frequency of Warfare
The level of warfare for ten organized states, six chiefdoms, twenty-five tribes, and nine bands. (Derived from Table 2.1 in Lawrence Keeley, War Before Civilization, 1996)
Figure 15. Deaths Due to Warfare as a Percentage of All Deaths
Deaths due to warfare as a percentage of all deaths recorded in representative societies. Ancient societies: Northern British Columbia, British Columbia, Southern California, Central California. Western Europe is for the seventeenth century; the United States and Europe are for the twentieth century. (Data from Table 6.2 in Lawrence Keeley, War Before Civilization, 1996)
Figure 14 presents Keeley’s data showing that there is no obvious distinction between bands, tribes, chiefdoms, and states in terms of warfare frequency—humans typically and frequently solve social disputes with violence, prehistorically, historically, and today—and they do so regardless of the political structure. Figure 15 shows, counterintuitively, that if there is any historical trend, it is that the death rate as a result of warfare is actually decreasing over time, with modern Western states representing the lowest death rate and premodern political organizations the highest. Figure 16 depicts the different sites within Native American Southwest cultures in which evidence of violent deaths, processed human remains, and mutilations of the dead occurred. These are not isolated events, since the sites studied include the remains of hundreds of murdered individuals. Figure 17 depicts the data from R. N. Holdaway’s study of the extinction of New Zealand moa birds after the arrival of Polynesian Maoris, surely the perfect image of Beautiful People by anyone’s standard in the West. When Europeans arrived in New Zealand in the 1800s, they found bones and eggshells of the large extinct moas, ostrichlike birds of a dozen different species ranging in size from three feet tall and 40 pounds to ten feet tall and 500 pounds. We now know, from preserved moa gizzards containing pollen and leaves of dozens of plant species and from archaeological digs of Polynesian trash heaps, that the Maoris committed a full-scale ecocide. Although some biologists have suggested a change in climate as the cause of the moa extinction, Jared Diamond makes the case that when the extinction occurred, New Zealand was enjoying the best climate it had had in a long time. Also, carbon-dated bird bones from Maori archaeological sites prove that all known moa species were still present in abundance when the Maoris arrived around 1000 C.E.
But by 1200 C.E.—six centuries before the arrival of Europeans—they were all gone. The final piece of evidence of what happened to the moas came from archaeologists who uncovered Maori sites containing between 100,000 and 500,000 moa skeletons, ten times the number living at any one time. In other words, they had been slaughtering moas for many generations, until they were all gone in a mass extermination.68
Figure 16. Evidence of Violence in the Prehistoric American Southwest
The number of sites in the American Southwest showing excavated evidence of violent deaths, processed human remains, mutilations of the dead, and sieges. These sites include the remains of at least 128, 252, an unknown number, and 174 individuals, respectively. The “processed remains” represent individuals who were apparently butchered for consumption. (Data from Steven LeBlanc, Prehistoric Warfare in the American Southwest, 1999)
There is nothing beautiful about the Beautiful People. Give them the plants, animals, and technologies—and the need through population pressures—to exploit their environment and they would do so; indeed, those that had that particular concatenation of elements did just that. In other words, centuries before and continents away from modern economies and technologies, and long before European White Males (dead or alive), humans consciously and systematically destroyed each other and their environments. The ignoble savage lies within.
Figure 17. Modeled Time to Extinction of the New Zealand Moas
The results represent modeling the time to extinction of the New Zealand moas after the arrival of the Maoris in the late thirteenth century. The model assumes 100 initial settlers, that no moa eggs were taken, and that no juvenile moas were killed. The model includes two moa consumption rates, two rates of human population growth, and whether moa habitat was destroyed. The light gray bar at the far right shows the results for the most likely of the tested conditions: 200 initial settlers, a 2.2 percent human population growth rate, and habitat destruction. (Data from R. N. Holdaway and C. Jacomb, “Rapid Extinction of the Moas” in Science, vol. 287, 2000)
Now that we have dispensed with the myths of pure evil and pure good, with what are we left? What remains when we strip away the mythic fog that has for too long shrouded human nature is human behavior—the things people do. So, a final way to view the visage of humanity is to think about human behavior not as inherently good or evil, moral or immoral, but as actions that we like and actions that we do not like, as these actions may be provisionally defined and judged. That is, in most circumstances, for most people, certain behaviors most of the time are considered moral or immoral, and we reward and punish those actions accordingly. Our political constitutions are formed according to our natural constitutions. As Preacher Casy tells Tom Joad in Steinbeck’s The Grapes of Wrath, after he explains that he has given up holding revivals because of the obvious hypocrisy between the content of his preaching about fidelity and the context of his own infidelities:
I says, “Maybe it ain’t a sin. Maybe it’s just the way folks is.” Well, I was layin’ under a tree when I figured that out, and I went to sleep. And it come night, an’ it was dark when I come to. They was a coyote squawkin’ near by. Before I knowed it, I was sayin’ out loud … “There ain’t no sin and there ain’t no virtue. There’s just stuff people do … . And some of the things folks do is nice, and some ain’t nice, but that’s as far as any man got a right to say.”69
This is not in any way to endorse a purely situational ethics or relative morality. The stuff some folks do really ain’t nice in most circumstances, to most people, most of the time. But there is no such thing as pure sin or pure virtue, any more than there is pure evil or pure good. The purpose of this exercise in ethical debunking is to shift the focus from sin and virtue, evil and good as metaphysical Platonic essences to quantifiable human behaviors that can be scientifically studied, causally understood, and ultimately modified or dealt with according to the needs and dictates of society. Evil forces do not exist, but evil acts are an all-too-human expression. Walt Kelly’s cartoon character Pogo put it simply: “We have met the enemy and he is us.”70
Aleksandr Solzhenitsyn put it more elegantly in his analysis of the gulags of the Soviet Union, surely a den of evil if ever there was one: “If only there were evil people somewhere insidiously committing evil deeds, and it were necessary only to separate them from the rest of us and destroy them. But the line dividing good and evil cuts through the heart of every human being. And who is willing to destroy a piece of his own heart?”71
Surely none of us, as Aeschylus suggested in Prometheus Bound:
Prometheus, Prometheus,
hanging upon Caucasus
Look upon the visage
Of yonder vulture:
Is it not thy face,
Prometheus?72
But remember, it was Prometheus who brought us knowledge, and through knowledge comes power, including the power of cultural amity to override natural enmity. We may always live in a world with walls; in recent history, however, the stone and mortar walls enforced by men with guns are gradually being replaced by invisible boundaries enforced by social contracts; in the future even these invisible boundaries may be replaced by semipermeable lines of demarcation, kept open through negotiation and cooperative exchange. It is a visage worthy of humanity.