Monday, November 30, 2015

Truth: Does Anyone Give a Damn?


The never-ending saga of the American political campaign for the presidential nomination has revealed (or newly emphasized) some amazing fractures where truth is concerned. Each week, it seems, one of the Republican aspirants for the crown issues a new “whopper” that the major media organizations still seem loath to call a “lie.” Leading the pack, of course, is Donald Trump, the man who seems to have no moral sense whatever. His latest whopper involves his claim that he personally “saw” thousands of Arabs cheering in Jersey City as they watched the twin towers fall on 9/11. Even faced with evidence that his claim is impossible—by both reporters and rivals like New Jersey governor Chris Christie—Trump has stuck to his story, this past week modifying it only somewhat by saying that what he saw was on television, and then that millions of people around the globe are convinced that they saw it too. And yet, no one calls him a “liar,” which is what he so clearly is. Rather, the shock jocks like Rush Limbaugh back him up, saying his essential facts are “what everyone in the world knows.” Ben Carson, also vying for the nomination (though no one can really say why; Carson himself seems to think God has chosen him for the role), has also been caught in several ‘whoppers,’ especially regarding his youth. He claimed that, when young, he was a ‘bad boy’ who had stabbed someone, until this proved slightly exaggerated; he also claimed that he had been offered a scholarship to West Point, though, as everyone knows, those who are admitted attend the military academy for free. Like Trump, though, Carson has refused to recant, and simply offers slightly amended versions of his whoppers. And Carly Fiorina, the one-time CEO of Hewlett-Packard (which nearly expired under her tenure), has famously insisted that she “saw” videos of feminists harvesting fetal brains, though no such videos have ever been located. She also continues to claim that 92% of the jobs lost under President Obama were lost by women—though the same statistic was once used by the Romney campaign until it was seen to be so obviously false that the campaign abandoned it. And others in the Republican camp keep doing the same thing: uttering false statistics and faux facts that they refuse to recant. The reason seems clear: none of the candidates seems to suffer in the polls because of such lies. Their supporters, like the candidates, only double down in their support and, also like them, attribute the criticism of their darlings to “liberal-left media bias.”
            What has happened? Do Americans care anymore about holding the aspirants to public office to something as arcane as truth? Do most contemporary Americans even have the capability to distinguish truth from lies? Or do most people now prefer what Stephen Colbert hilariously called “truthiness”—the feeling in our gut that we are right, without the need for all that tedious evidence, and logic, and fact?
            These and other questions are very much at issue in Charles Lewis’ recent book, 935 Lies: The Future of Truth and the Decline of America’s Moral Integrity (Public Affairs, 2014). Lewis has been a journalist all his adult life, serving as an investigator/producer for Mike Wallace on 60 Minutes, and then founding several nonprofit organizations devoted to bolstering the battered profession of investigative journalism, most notably in 1989 when he founded The Center for Public Integrity. What he does in 935 Lies is to show—using some of the familiar cases of the last half-century marked by both government and corporate malfeasance (the Tonkin Gulf Resolution, based on a presidential lie; the Watergate scandal, based on countless presidential lies; the rush to war in Iraq, based on no fewer than 935 lies by Bush Administration officials claiming Saddam’s possession of Weapons of Mass Destruction as their casus belli; and the notorious history of tobacco industry denial and obfuscation of the clear link between tobacco smoke and lung cancer)—how both government and corporate lying has become more blatant and ubiquitous over the years. Indeed, Lewis himself had produced the 60 Minutes show on tobacco industry lying called “Tobacco on Trial,” in the course of which he was heavily pressured by the CEO of CBS, Laurence Tisch, to kill or at least modify the story. Lewis prevailed in that battle, though a friend of his from ABC, the Emmy-award-winning Marty Koughan, was forced in 1994 to stop his investigation of the tobacco industry on the show Turning Point. Philip Morris had filed a $10 billion libel suit against ABC, and the network executives decided not only to kill the story, but, as part of a settlement, to publicly apologize to Philip Morris for its reporters’ attempt to tell the truth. That truth, as cited by Surgeon General C. Everett Koop, is that due to the lies of the tobacco industry over the years, “100 million people around the world died from smoking-related illnesses in the 20th Century, according to the World Health Organization,” and that an additional one billion would die in this century. To sum it up, Lewis cites what Marty Koughan told the Washington Post in August 1995: With its lawsuit, Philip Morris had shown that “for a paltry $10 million or $20 million in legal fees…you can effectively silence the criticism” (139).
            This is really the key for Charles Lewis. The problem had morphed from the hiding or censorship of important information by the government, as in the Gulf of Tonkin Resolution (North Vietnamese boats had not attacked American warships), or by corporations, as in the tobacco wars, to self-censorship by the major news media. Lewis saw this firsthand in his attempt to produce a 60 Minutes segment called “Foreign Agent,” in which he exposed how former US officials cashed in on their political connections by working as lobbyists on behalf of foreign governments. He had focused on Pete Peterson, a former Commerce Secretary who had gone on to co-found and chair the investment firm Blackstone, whose consultants exemplified precisely the government-to-industry pipeline the story was exposing. But Peterson was a close friend of 60 Minutes’ creator, Don Hewitt, who told Lewis to ‘edit’ the story. Lewis finally agreed, substituting former Reagan budget director David Stockman (also with Blackstone) for Peterson. Though that didn’t solve all the problems or spare Lewis the blame laid on him, even by Mike Wallace, the story aired as edited in the end. But Lewis knew he had had enough:
“I had had a jarring epiphany that the obstacles on the way to publishing the unvarnished truth had become more formidable internally than externally” (197).

In short, even before government or corporate pressures come into play, the media censor themselves to avoid trouble and to maintain cozy relationships with power. And the real conundrum of television news, according to Lewis, had become (or perhaps always was) the basic conflict between money and truth: “TV is an immensely powerful medium, but its potential to make astonishing sums of money is typically realized only by appealing to the lowest-common-denominator instincts of viewers.” The most famous instance of this came in 1966, when the legendary producer Fred Friendly, by then the president of CBS News, tried to broadcast live coverage of Senator J. William Fulbright’s Senate Foreign Relations Committee hearings on the war in Vietnam. To do so, however, CBS would have had to interrupt its profitable daytime reruns of I Love Lucy. Not surprisingly, CBS execs chose I Love Lucy. To have aired the Fulbright hearings over Lucy would have represented, as David Halberstam put it, “a higher price for democracy than most network executives would be willing to pay” (164).
            After quitting CBS in 1989, Charles Lewis founded The Center for Public Integrity, today the largest nonprofit investigative reporting organization in the world. It has not only received Pulitzer Prizes, Polk Awards, and countless other honors for its serious investigative reporting, but has also inspired similar organizations to take up the seriously curtailed reporting traditionally done by large newspapers—which can no longer afford it. Lewis is rather optimistic about these developments, including those on the Internet, and about the general public’s continuing ‘thirst’ to know the truth.
            I am not so sure. Consider what we began with: the steep decline in any price paid by presidential aspirants caught in outrageous lies. The public simply does not seem to care if their public figures shade or distort or completely falsify the truth. This is partly because of the decline in investigative reporting that Lewis details in his book. It is also due to secrecy—the mania for classifying documents that Lewis also records: the number of US government documents classified annually, for example, rose from 8.6 million in 2001 to 23.8 million in 2008, and then, in just two more years, to 76.7 million in 2010. All of these documents require a security clearance of one degree or another to read, with 1.2 million Americans now holding Top Secret clearance. As Lewis puts it, “our citizenry is divided into two tiers—a small elite with access to inside knowledge about our government, and a vast lower echelon that is kept in the dark” (227-8). But what Lewis cites about the dumbing down of the media is also clearly a factor. Here, a recent article by Matt Taibbi has some very salient points to make.
            Taibbi’s article (in Rolling Stone, reprinted 27 November on Reader Supported News), responding to the same lies by Republican candidates cited above, is titled “America Is Too Dumb for TV News.” Like many of us, Taibbi is shocked by the apparent indifference of the public to the level of lying, and more, by the lack of any penalty paid by the liars when their lies are publicly exposed. As he puts it, it used to be that “if a candidate said something nuts…the candidate ultimately was either vindicated, apologized, or suffered terrible agonies.” As, for example, Al Gore did when confronted with his implication that he had invented the Internet. Now, however, politicians “are learning that they can say just about anything and get away with it.” They just blame the media and get to be heroes to their media-hating base, which is convinced that the liberal media is all “controlled by special interests” who want an established candidate as their nominee.
            But Taibbi goes deeper, into what major media in this nation have become—“a consumer business that’s basically indistinguishable from selling cheeseburgers or video games.” Where once TV news was dominated by Edward R. Murrow and real investigative reports on important issues of the day, now it’s nothing but “murders, bombs, and panda births, delivered to thickening couch potatoes in ever briefer blasts of forty, thirty, twenty seconds.”  The results are clear: When you make the news into this kind of consumer business, pretty soon “audiences lose the ability to distinguish between what they think they’re doing, informing themselves, and what they’re actually doing, shopping.” So it’s not just that at some point, TV executives decided that their audiences preferred I Love Lucy (and the money they coughed up for goods promoted there) to Fulbright Senate Committee Hearings on dull old Vietnam; it’s also that after many years of this, audiences simply no longer have the ability to distinguish between corporate-generated pseudo-news and real information necessary to a democracy. Between the truth as supported by facts, and the lies that stroke their preferences about the way things in the world are, or should be. Most people must know, by now, that what they’re seeing on TV commercials are lies—distortions of the truth, if not outright fabrications. The problem is, the commercial lies begin to bleed over into the news distortions. The news omissions. The news that becomes simply a slight makeover of whatever foreign or domestic position the opinion-makers hand out for us to believe.
            And perhaps it’s even more basic than that. It really becomes a question of how much truth we wish to know, or can bear to know. How much do we wish to know about the dire predictions regarding climate change? Regarding the complex mess in Syria deriving from our own foreign misadventures? Regarding the critical state of our oceans, or our garbage dumps, or what we’re doing to our own bodies, our own planet? About how we’re being manipulated daily by the most corrupt, murderous, money-hungry corporations on this planet, which have literally become governments unto themselves? About how the endless dramas on TV about cops and their glorious efforts to protect us are really meant to disguise how corrupt they truly are, how specifically they are designed to protect only certain segments of the population while controlling the ‘other’ segments, even unto their deaths? How much of this do we really want to know? Or would we rather tune in to I Love Lucy?
            In short, though we are surely the target of massive deception from above, we also collude in our own self-deception; the self-censorship we confront stems, at least in part, from our own willingness to leave the hard decisions to others. Trained by media to be spectators at the pro game rather than players in our own, to be watchers rather than actors or singers or protestors, we grow more and more content to let the world proceed along lines of least resistance and to enjoy our mediated world, our pre-fabricated confinement, our gut-satisfying “truthiness,” as Stephen Colbert would have it. That’s because knowing is hard; knowing takes effort; being aware of what is being done to us, and of what we’re doing to ourselves, forces us to think and to consider changes. And change is hard. Sadly, the changes have been proceeding at an alarming pace beneath our level of awareness, and have gone very far indeed. Whether we, most of us, can ever get back to that thirst for real truth that Charles Lewis is still convinced has never left us remains to be seen. But it surely appears, in this terrible season of idiot politics and faith-based slaughter, that the prospects are not very good.

Lawrence DiStasi

Wednesday, November 25, 2015

Thanksgiving


I’ve written about the myths and ‘real truths’ of Thanksgiving before (Nov. 27, 2009). But today, as I was rereading some of the more contemporary truths about the subsequent butchery of Native Americans that took place, I couldn’t help thinking about how this story of our American origins leads to uncomfortable parallels with the more modern butchery we like to deplore in the Middle East. We love to call the terrorists of ISIS, for example, ‘barbarians’ and ‘savages.’ And in truth, as I read a piece today by David Remnick in the New Yorker (“Telling the Truth About ISIS and Raqqa”, reprinted on Reader Supported News), in which he interviews some exiled residents of the ISIS stronghold of Raqqa about the horrors that descended upon that previously civilized city when ISIS took over, those same epithets flashed into my mind: bastards, savages, barbarians. But the truth is that they are not savages or barbarians. If there is any name that fits the fanatics of ISIS, it would have to be ‘fundamentalists.’ For they are, as countless histories have now detailed, adherents to the type of Islamic fundamentalism native to Saudi Arabia (one of their supporters), the branch called Wahhabism. When the ISIS leaders—both remnants of Al Qaeda in Iraq and many of the former Sunni leaders of Iraq, including army officers quite skilled in combat—took over Raqqa, they immediately named it the center of the caliphate that Wahhabism calls for: a place where the most radically fundamentalist ideals of Islam can be implemented and spread throughout the world, including the public execution of infidels.
As the former residents of Raqqa describe it, the savagery instituted by the fanatics of ISIS horrified everyone:

 The first crucifixion came early that spring—a horrific event to recall even now. Everyone at the table remembered the shock of it. Then came more: two people, shot in the head by ISIS executioners, crucified, and left for days for all to witness in the city’s main traffic roundabout.

According to the testimony, this was a kind of violence never seen before: crucifixions, beheadings, brutally enforced edicts against forbidden pleasures. The repression and violence against women were especially vicious:

      There were edicts against drinking and smoking. Enforced by an all-female morality police called the Khansaa Brigade, women were made to wear the veil and, eventually, black shoes only. They are beaten if their niqab is somehow too revealing, a veil too flimsy, or if they are caught walking on the street alone.

Children, too, are targeted. Regular schools are closed in favor of ISIS religious institutions teaching them “the most fanatical form of the faith.” Then, many are lured or kidnapped and sent to military camps to learn how to fight and kill, “how to make and carry bombs. At their graduation, they have orders to execute someone––sometimes a beheading, sometimes they just cut off the head of a sheep.” As for the journalists of the publicity group Remnick interviews, R.B.S.S., or Raqqa Is Being Slaughtered Silently, they are hounded and tracked via the internet, where they post much of their work. One has already been captured, caught at a checkpoint and executed three weeks later in Raqqa’s public square.
      Of course, it goes without saying that ISIS has already proven that when attacking ‘infidels,’ which is to say, the rest of the world, they are as ruthless and callous about killing innocents as any group in history. The attacks on Paris, the bombing of the Russian airliner with 224 people aboard, the suicide attacks in Lebanon and Iraq and Mali, and the public beheadings posted on the internet, are all grim proof of that. But what strikes me today is that the comparison to other fundamentalist groups comes right back to America, to our founding myths and those who are celebrated in them.
      The Pilgrims, for example, are the historical heroes of the alleged First Thanksgiving. It has become part of the legend that the Pilgrims—those few who survived the first winter in their new-found paradise near Plymouth—held a thanksgiving feast to celebrate the corn harvest that had kept them alive, and invited the local Indians to share it with them. But it is important to know that these Pilgrims were themselves an extreme branch of the Puritan movement that would later wage revolutionary war against the Crown in their native England (the Puritans triumphed for a time under Cromwell), and that they had fled to the New World both for safety and to erect a new government based on their rigidly fundamentalist beliefs (including a coming Armageddon). Chuck Larsen, basing much of his summary on the work of Francis Jennings in The Invasion of America, has written about their fanaticism as follows:

They strove to "purify" first themselves and then everyone else of everything they did not accept in their own interpretation of scripture. Later New England Puritans used any means, including deceptions, treachery, torture, war, and genocide to achieve that end.(4) They saw themselves as fighting a holy war against Satan, and everyone who disagreed with them was the enemy. This rigid fundamentalism was transmitted to America by the Plymouth colonists, and it sheds a very different light on the "Pilgrim" image we have of them. This is best illustrated in the written text of the Thanksgiving sermon delivered at Plymouth in 1623 by "Mather the Elder." In it, Mather the Elder gave special thanks to God for the devastating plague of smallpox which wiped out the majority of the Wampanoag Indians who had been their benefactors. He praised God for destroying “chiefly young men and children, the very seeds of increase, thus clearing the forests to make way for a better growth.” (Chuck Larsen, “Introduction for Teachers,” http://www.manataka.org/page269.html).

Furthermore, the Indian hero, Squanto, supposedly one of those who helped arrange the feast, managed to be there because he was one of the few Indians who had survived both slavery in England and the above-mentioned smallpox epidemic that decimated the natives after an initial contact with English explorers in 1614. And subsequent “Thanksgivings” celebrated by the now-dominant Puritans of the Massachusetts Bay Colony rather stain the myth of generous settlers sharing a meal with happy natives. In 1637, for example, near Groton, CT, “nearly 700 men, women and children of the Pequot Tribe had gathered for their annual Green Corn Festival”—the actual precursor of Thanksgiving. What happened next was a horror:

                           In the predawn hours the sleeping Indians were surrounded by English and Dutch mercenaries who ordered them to come outside.  Those who came out were shot or clubbed to death while the terrified women and children who huddled inside the longhouse were burned alive. The next day the governor of the Massachusetts Bay Colony declared "A Day Of Thanksgiving" because 700 unarmed men, women and children had been murdered. (Susan Bates, “The Real Story of Thanksgiving,” http://www.manataka.org/page269.html).
This massacre-cum-Thanksgiving was followed by others, like a similar massacre of Pequots near Stamford, CT, where severed heads were kicked around like soccer balls, and the similarly savage King Philip’s War, which essentially ended the presence of Native Americans in New England. Any who were left either fled to Canada or were rounded up and sold into slavery in the Carolinas (thus providing the inspiration for the Bostonians to troll in Africa for other slaves to fatten their purses and curse the new nation for generations to come).
            In sum, for the life of me, I can’t figure out how to distinguish our ‘noble’ ancestral founders from the fanatic fundamentalists now ravaging Syria, Iraq and the entire Middle East—nor, for that matter, the crazy fundamentalists now running for president on the Republican ticket. All are certain of their possession of the truth. All are convinced that the only way to rid the world of ‘infidels’ or ‘unbelievers’ or ‘sinners’ against their ‘god-given’ law is to either convert them or kill them. And the sad thing is that everywhere one finds these purists, these soldiers in the war to institute their precious god’s kingdom on earth, killing seems to be the preferred option.
Lawrence DiStasi

Sunday, November 15, 2015

Differential Responses to Terror


Like almost everyone else, I find my mind swamped with images and thoughts about the vicious attacks in Paris on Friday night. With Parisians out for a night on the town, including thousands attending a soccer match, the eight or more terrorists picked so-called ‘soft’ targets in a relatively small, hip area and attacked restaurants, a concert hall, and the stadium where the above-mentioned soccer match was underway. Scenes routinely described as ‘scenes of horror’ ensued, with the concert venue the most revolting: terrorists armed with AK-47s fired randomly and coldly at hundreds of concertgoers below them, and then, when about to be eliminated by the police, blew themselves up with suicide vests. The only comparison that comes to mind is the similar scene in a movie theater in Aurora, CO, when the American James Holmes randomly shot and killed twelve theatergoers at a showing of a Batman movie (sadly, no one vowed war on the NRA as a response). But of course, we have a rich field from which to choose for horror in our time: the Russian plane that exploded over the Sinai, killing all 224 passengers aboard; the suicide attack in a Beirut suburb where nearly 40 people were killed; countless suicide bombings and shootings in Afghanistan and Iraq, both still reeling and broken after the U.S. shock-and-awed them in the wake of 9/11; the recent U.S. attack in Kunduz province in Afghanistan, an attack this time on a hospital run by Doctors Without Borders. There seems to be no inhibition whatever, in our time, that prevents the murder of innocents in any place and at any time.
            What interests me here is the way we, especially we in the western world, respond to these horrors. Our response is, of course, a major part of the calculus of the terrorists who perpetrate such attacks. They know that though death hardly registers in our consciousness when it is ‘other’ innocents who are slaughtered—as, for example, when over 2,000 Gazans, mostly helpless civilians, were killed by the Israeli military in its most recent onslaught on that tiny strip of misery—the death of our people, of white Europeans or Americans in our ‘homeland’, is greeted with terror, with horror, with outrage, with cries that such barbarity must be avenged, must be repaid tenfold. These are exactly the sentiments coming out of France at the moment. France’s president, Francois Hollande, has declared that ‘this is war.’ And realistically, who could blame him? After the attacks on Charlie Hebdo earlier this year, after 1200 or more French citizens have joined the Islamic State in Syria, and now, after this attack, the French are legitimately feeling that they have been specially targeted by the ruthless fanatics who run ISIS. Though the said fanatics would no doubt prefer to attack New York or Los Angeles, they apparently have concluded that Paris is a more reachable, ‘softer’ target. They seem to think that this will discourage Europeans, and somehow persuade them to pull back from their participation in American-led attacks against them in Syria. That this is delusional, that their entire fundamentalist, apocalyptic mode of thinking is insane, does not seem to matter. Or rather, in a certain sense it matters most of all: such people, convinced that the world is ending anyway, seem to figure that dying a little sooner than the rest of us confers glory on them, not least because it will help bring on the apocalypse they yearn for.
            But I digress. What I really mean to focus on is how our responses to death, to the sudden death of innocents brought on by the terror of modern weaponry, differ, depending on who does the killing and who does the dying. Consider the response to the recent downing of the Russian passenger liner over the Sinai desert. It was loaded with 224 tanned vacationers returning from some time in the sun at the Egyptian resort of Sharm el-Sheikh. Initially, and still to this day, Egyptian officials have refused to even call it a terror attack, with Egyptian crash investigator Ayman El-Muqadem insisting that an explosion could have occurred in several other ways, including “lithium batteries in the luggage of one of the passengers, an explosion in the fuel tank, fatigue in the body of the aircraft, or the explosion of something” (Christian Science Monitor, Nov. 8, 2015). Increasingly, however, most countries are subscribing to the terrorist theory: that a bomb was smuggled onto the plane and its explosion brought the plane down. Whatever the cause turns out to be, the interesting thing is that almost no western journalists have rushed to Moscow or the Sinai to record tearful reactions from Russians who lost more loved ones than the French (as countless journalists like Katie Couric are doing in Paris). The airliner attack is either treated routinely, as ‘just one of those things,’ or even as something deserved. Russia, after all, remains our chief rival and increasingly our renewed enemy, with its leader, Vladimir Putin, characterized more and more as, if not quite a Hitler, then close.  He keeps interfering in our global plans and machinations, such as the coup in Ukraine (right on Russia’s border, it should be noted, and hence well within what we like to call a ‘sphere of influence’ when it’s in our hemisphere), and now in Syria (also much closer to, and far more threatening to, Russia than to the United States).  So when the Islamic State recently claimed responsibility for bringing down the Russian passenger liner, attributing it to Russia’s recent bombing campaign against them in Syria (and in support of Assad), one could almost hear the ‘served-them-right’ murmurs in the western camp.
            The same holds for the recent suicide bombing in the Hezbollah-dominated suburb outside Beirut. We saw evidence of the explosion, we heard a few screams, but underlying all the coverage was, again, a certain suppressed gloating. Hezbollah, after all, has been supporting the evil Bashar al-Assad, our latest candidate for Hitler’s mantle. Those who support Assad, such as Hezbollah and Iran and Russia, become, ipso facto, our enemies. So if even ISIS, supposedly the most mortal of our mortal enemies, suicide-bombs civilian targets allegedly controlled by Hezbollah, then that is a plus in our ledger. Any deaths that come as a result are to be lamented on the surface, perhaps, but secretly cheered.
            One could cite countless other terror attacks and a similarly muted response to them on our part. But consider what might be called terrorism but usually isn’t: the bombing of innocent civilians by the so-called “good guys.” The bombing of Gaza by the U.S.-supplied Israeli military comes immediately to mind. What else but terror can one call the relentless campaign of aerial and rocket bombing against a population imprisoned in the most densely populated piece of real estate in the world? What else but terror is the targeting of schools, of apartment complexes, of hospitals? But we don’t call it that, because the victims themselves are alleged to be terrorists or harboring terrorists, and the perpetrators are our close allies and hence experimenters with our own advanced weaponry to carry out what we call “retaliation.” Sadly, the same rationale is used to describe our recent ‘mistaken’ bombing of the Doctors Without Borders hospital in Kunduz. How could this happen? As with the Israelis in Gaza, we knew or should have known the well-publicized GPS coordinates of this hospital. And yet, our planes bombed the hospital anyway, and even worse, allegedly attacked wounded patients trying to flee. Doctors Without Borders and several other groups have insisted that this was a war crime: hospitals are supposed to be protected, immune from attack even in war zones. But the main response from U.S. officials has been expressions of ‘regret’ over the ‘mistake.’ And from the U.S. public? Well, it was, after all, ‘those people’ in a war zone, some of whom may have been terrorists themselves. No need to concern ourselves, except a little that our shiny reputation might be tarnished.
            So this is what we have. Death is regrettable, and awful, and tragic, and sometimes outrageous, but it usually depends on whose death is at issue. If it’s a relative or close friend or one of our in-group, it hits us very hard, especially if it seems it could or should have been prevented. If death happens to one of ours as part of organized warfare, then it’s also ‘tragic’ but expected, and can be dressed up, in the end, as part of a necessary and noble sacrifice. And if the dead are ‘theirs,’ even if they are civilians and hence ‘innocent,’ we find ways to tolerate the deaths we’ve caused, rationalize them as ‘collateral damage,’ part of the messy business of defeating an evil enemy. But if death comes as part of an attack on us or our friends, in a manner that we label ‘terrorist’ (notwithstanding the legitimacy of the attackers’ grievances and their relative powerlessness to express those grievances in conventional ways), then it becomes an outrage. Then the killing becomes ‘barbaric’, regardless of the proportion of the lives ‘they’ have ruined relative to those we have ruined. And there seem to be endless ways in which we parse out their barbarity, and our outrage, respecting those deaths. In other words, there are ‘rules’ with which we insist all combatants in a conflict must comply: our clearly defined rules of war; our clearly defined rules for the taking of prisoners; our rules respecting which areas or institutions are legitimate targets. That the rules (like the rules regarding money, interest, and bankruptcy) are usually made to favor the more powerful party is mostly ignored or suppressed. Rules are rules, after all. And what terror does to outrage and terrify us is violate the most fundamental of those rules, our rules. Terrorists do not fight fair. Terrorists pick victims at random with no concern for their guilt or innocence, and snuff them out for no legitimate reason. And when the victims are “our” people, then the randomness, the unfairness, the barbarity—regardless of the relative numbers involved—are all considered more extreme, more unfair, more beyond the pale of what we have decreed to be legitimate, than anything we do or could even conceive of doing.
            This is, of course, natural to most humans. Those in our group, those on our side (including God), are always considered to be more deserving, more valuable, more innocent than those on the side of the ‘other.’ To paraphrase Orwell, ‘all lives are valuable, but some lives are more valuable than others.’ Thus some deaths deserve to be lamented more than others. Some deserve to be grieved more than others. The useless waste of some lives deserves more attention than the useless waste of others. There may be no way to resolve this dilemma short term. But noticing it—especially before we rush off to scream for overwhelming and merciless retaliation—reflecting upon it, and eventually perhaps coming to see that the loss and waste of every life on every side is painful and deserving of our attention and our empathy, would certainly be worthwhile.

Lawrence DiStasi

Wednesday, November 11, 2015

Moral Nature


Many people, even today, like to see humans as something apart from the rest of the world—the special species that either has so far outdistanced our nearest relatives that comparisons are no longer useful or even possible; or the special species created by God in his own image and likeness, and therefore not even related to the rest of brute creation. Either view massages the human sense of ourselves as elevated, different, and with ultimate dominion over all our planetary co-habitants. Made by God or Nature to rule, we can do whatever we wish to not just the domestic animals we raise and manipulate for food, but to all wild animals as well. And what particularly gives us this sense of ourselves as unique and uniquely in charge is our morality. Though our bodies may run on the same kinds of energy and operate with the same kinds of cells and bodily structures and even brains common to most life forms, and though some primates may be able to communicate using signs we’ve taught them, no other animal has an even remotely comparable sense of fairness, of compassion, of justice. And all those abstract qualities depend chiefly on our highly developed sense of reason—on our brain-centered rationality. We alone can look at a problem, figure out its origins and causes, and come up with a rational solution.
            Now anyone who has been paying attention to science in recent years knows that such views have been getting a terrible buffeting since the days of Charles Darwin. Darwin showed, and subsequent scientists have filled out and firmed up, that far from being separate, we descend directly from the rest of creation, and that our distance from our nearest relatives, the great apes, seems ever narrower. We share at least 98.7% of our DNA with bonobos and chimpanzees. Which, again, might be acceptable and leave us some “special” room if only we could still claim some distinction like morality. Alas, according to Frans deWaal, a researcher at Emory University’s renowned primate lab and author of The Bonobo and the Atheist: In Search of Humanism Among the Primates (Norton, 2013), this small comfort can no longer be maintained. According to deWaal, and several others he cites, recent neuroscience research demonstrates that most of our vaunted moral and social distinctions can be found in similar form among bonobos or chimps, or both. Their roots can even be found among just about all mammals. What this means is that far from being a gift from the God we’ve been worshipping for several thousand years in a variety of religions (that great theologian Ronald Reagan said in 1984: “as morality’s foundation is religion, religion and politics are necessarily related…”), morality is actually built into our brains, especially our emotional brains.
            This seems to me to be good news. It means that, contrary to the view of many conservatives and religionists, who see humans as basically ‘bad’ and therefore in need of harsh rules and punitive controls to ‘civilize’ us, we humans, with our basic equipment deriving from apes, are programmed not only to be good but also to do good. deWaal quotes his colleague at Emory, James Rilling, to the effect that we have “emotional biases toward cooperation that can only be overcome with effortful cognitive control” (49). As deWaal explains it, this means “that our first impulse is to trust and assist; only secondarily do we weigh the option of not doing so, for which we need reasons.” Using brain-monitoring techniques, Rilling showed that when normal people aid others, “brain areas associated with reward are activated.” As deWaal succinctly puts it: Doing good feels good. To be sure, there are psychopaths in any population whose brains, for whatever reasons, lack these emotional rewards. But in a later section, deWaal follows Chris Boehm in surmising that evolution has probably worked to marginalize these outliers: the penalty, in social groups, for not cooperating can be harsh, ranging from ostracism to complete elimination—thus minimizing the propagation of those with such genes.
            What deWaal does in the rest of his book is to show that far from being imposed from above by a law-giving God, or derived from the reasoning of deep-thinking philosophers, the moral law in fact “arises from ingrained values that have been there since the beginning of time” (228). This is especially true among animals that live and hunt cooperatively, like dogs, chimpanzees and bonobos:

The most fundamental one [i.e. value] derives from the survival value of group life. The desire to belong, to get along, to love and be loved, prompts us to do everything in our power to stay on good terms with those on whom we depend. Other social primates share this value and rely on the same filter between emotion and action [i.e. on inhibitions] to reach a mutually agreeable modus vivendi. (228).

It also applies to our Neanderthal relatives, who, recent fossil evidence shows, took care of the infirm (individuals afflicted with dwarfism, paralysis, or the inability to chew survived into adulthood) in the same way early humans did. Since Neanderthals lived hundreds of millennia before civilization and its gods, this means, again, that morality and its basis in empathy existed well before civilization itself. Further, since the evidence of empathy among primates is by now well-established (mirror neurons, the brain cells that mediate empathy and allow us to feel what another is going through, were first discovered in macaque monkeys), it seems quite clear that humans easily adopted the moral laws promoted by religions (like the ten commandments) primarily because they were already inclined to be moral. To cooperate. To help others in order that more of the group might survive. And, from the negative side, to avoid hurting others by inhibiting the impulse to do them damage. As deWaal sums it up, “a social hierarchy is a giant system of inhibitions, which is no doubt what paved the way for human morality, which is also such a system” (150).
            This latter point has great relevance for our time, it seems to me. For we are even now engaged in a great debate about what economic system best fits our human nature. Up till very recently, we have been told that capitalism is “natural,” that competition is “natural,” that the war of every individual against every other individual to monopolize resources is “natural.” But if our primate inheritance prizes cooperation, prizes helping others, inhibits us from hurting others, puts the welfare of the group or community above the individual impulse to harm or to hog everything to oneself, then this attribution of “natural” would seem to have serious shortcomings. Consider what deWaal writes about the ideas of fairness and justice, both fundamental components of any moral law. He first cites the “egalitarianism” of hunter-gatherer groups, where “hunters aren’t even allowed to carve up their own kill, in order to prevent them from favoring family and friends” (231). In other words, the inhibition against taking all for oneself or one’s family is a primary form of fairness—and ultimately, of course, a way to ensure not only that all members of the group get a share in any one individual’s luck, but also that reciprocity will dictate the same fair division when another individual brings home a kill. Many experiments have shown how ingrained this preference or insistence on an even split is, not only among humans, but also among our primate relatives. In one experiment, capuchin monkeys played a game whose reward was cucumber slices. All the monkeys were fine with this, and played the game. But when the experimenters started to reward some of the monkeys with grapes (a preferred food), the ones still given cucumbers vehemently protested and, indeed, refused to accept the cucumbers at all. They actually tried to destroy the whole game. Economists would call this refusal of perfectly good food “irrational”; but, as deWaal points out, “it is an irrationality that transcends species.” It is a deeply emotional fairness response that all primates exhibit, and that even dogs (also group hunters) exhibit as well. deWaal cites a finding by Friederike Range at the University of Vienna, where “dogs refuse to lift their paw for a ‘shake’ with a human if they get nothing for it while a companion dog is rewarded” (234). deWaal summarizes these findings about fairness and justice as “ancient capacities,” which “derive from the need to preserve harmony in the face of resource competition” (234).
            Preserving harmony. The primacy of group welfare. Cooperation. Anger and refusal in the face of inequity. Suddenly it seems that much of the discontent coursing through modern societies is not derived from some outlandish and artificial notion of social justice and fair play. Suddenly it seems that humans, like all other primates, are primed to react emotionally—however irrational it may seem to some—to perceived unfairness and the unequal distribution of goods. When Occupy Wall Street protesters shout and protest about the 1 percent taking all the wealth, leaving the 99 percent with crumbs; when Bernie Sanders, running for president, rages about the obscenely unequal distribution of wealth in the United States; when fast-food workers demand a living wage in spite of their low-skilled jobs; when Greeks demand that their government refuse to pay off loans to predatory banks while cutting pensions and health care; when we all react with nausea when we read of corporate heads paying less in taxes than their secretaries; we should begin to see this not as illogical or irrational but as an upwelling from an ancient part of the primate brain that is built into what it means to be human. And the constant harping on individuality and “looking out for number one”? That we should see as a regrettable leftover from outlier impulses that should have been, and should still be, relegated to the genetic waste bin. Morals and the moral code itself derive from our deep inheritance as cooperating animals whose primary impulse is to get along in order to survive; to inhibit self-centered accumulation as destructive of group harmony; and to help others because helping others feels good.
            Only then, after we have caught up with our primate kin, might we be ready to fulfill our role as fully-human human beings. Which is to say, going even beyond the primate need for in-group survival and realizing, as only humans can, that we must extend our empathy and our protection to all life forms that share this planet with us, that literally make it possible for us to even be here.

Lawrence DiStasi