Thursday, September 15, 2016


As the presidential election season enters its last two months, my mood and the mood of the country grow ever more dispirited. The reason is simple: the two major-party candidates, Hillary Clinton and Donald Trump, are the least-liked, least-admired, least-trusted presidential candidates in recent history. And everyone is feeling the depression that comes with facing a voting booth to select a candidate you would prefer were not even running, a candidate you feel compelled to vote for primarily to prevent the other even-more-hated choice from winning. We all feel that and more. There’s desperation mixed in too—the desperation of those who are convinced that a President Trump (or a President Clinton) will be a disaster for the nation. And for me, personally, there is the desperation that comes from contemplating a United States of America that is so impoverished of leadership, so cynical about elected officials at every level, that it has offered up two people out of the 300 million in this vast country who are more repulsive on almost every level than even the recent idiots who’ve run for the office. It is as if we were suddenly back in ancient Rome and the imperial line had thrust upon us a Caligula or a Nero or a Tiberius. But Rome by that time was no longer a republic; it had become an empire ruled by the imperial line beginning with Augustus. The state and its rulers were corrupted by absolute power and the license and perversion that often accompany inbreeding. The United States, by contrast, is still nominally a democracy, a representative democracy promoting the myth that the people rule and the rulers are obliged to follow. And for a while, with the election of Barack Obama as the first black president, it almost seemed as if this really were the case. No more. Clinton comes from a presidential dynasty via her husband, who was president twenty years ago and first kowtowed to the oligarchs. And Donald Trump comes from a line of inherited wealth that allows him to trumpet himself as beyond the corruption of money.
            That’s not all. The mood of the country is more than dispirited; it’s ugly, it’s threatening, it’s beginning to seem akin to what pertained in 1930s Europe. Both sides—but especially conservatives—have retreated to a kind of unbridled rhetoric that has abandoned all sense of restraint or decency or balance. “Incendiary” only begins to describe it, and it begins with what Trump started out with: nasty, snarling attacks on immigrants, on Muslims, on the President, on his primary rivals, and now on his opponent. It’s as if we’re in some schoolyard brawl. And unfortunately, this type of rhetoric matters. It inflames supporters and encourages them to violent actions to match the violent rhetoric (several protesters have been beaten up at Trump rallies). While some Republicans feign outrage over Trump’s lack of restraint, its source can be traced back to the basic stance of the entire party in Congress which, from the moment Barack Obama took office, made no secret of its overriding strategy: to block anything that might make the first black president look effective. The disrespect for a sitting president (and the disrespect was a thinly disguised ‘dog whistle’ meant to signify contempt for the Black man in the White House) thereafter was built in. To get to Trump and his nasty, gutter-spawned generalizations was only a short leap. The point is that this stuff is contagious: and so we have had Hillary Clinton recently generalizing about Trump supporters as “deplorables”—racists, xenophobes, Islamophobes, and basically ‘white trash.’ Which they in large part are, of course. But it is not the kind of general condemnation that is likely to win over those voters with doubts about their candidate, or to calm the election waters.
            Now we have candidates actually making overt references to violence and bloodshed. First Donald Trump said that if Hillary Clinton is elected and gets to appoint federal judges, “there’s nothing you can do folks,” and then added this veiled appeal to gun owners, with its hint of assassination: “Although the Second Amendment people—maybe there is, I don’t know.” More recently we have had the governor of Kentucky, Matt Bevin, going even further to predict the dire effects if Hillary Clinton becomes president. In a speech to the Values Voter Summit on September 10 in Washington DC, Bevin’s answer to whether the country could recover from a Clinton presidency raised the specter of real bloodshed:
            “I do think it would be possible, but at what price? At what price? The roots of the tree of liberty are watered by what? The blood of who? The tyrants to be sure, but who else? The patriots. Whose blood will be shed? It may be that of those in this room. It might be that of our children and grandchildren. I have nine children. It breaks my heart to think it might be their blood that is needed to redeem something, to reclaim something that we through our apathy and indifference have given away” (Associated Press: Sept. 13).

            What the hell is going on here? This is not Germany after World War I. This is not Italy in the 1920s. But clearly, some people in these once-United States are feeling the kind of desperation that leads them to think in terms of problems so intractable, of marginalization so extreme, that American blood may have to be shed. Revolutionary blood? Civil War blood? Some further remarks by this alleged governor, Bevin, provide more clues to this perceived alienation:

“Look at the atrocity of abortion, so many have remained silent. It’s a slippery slope. First we’re killing children, then it’s ‘don't ask, don't tell.’ Now it’s this gender-bending, don’t ask, don’t be a bigot, don’t be unreasonable, don’t be unenlightened, heaven forbid…” (Ibid.)

In other words, the rhetoric used for abortion—“we’re killing children”—has segued easily into the rhetoric of killing, of battle, of armed resistance, of revolution. People whose values have been violated by modern society—which provides choice for women to control their reproduction, and for gays to marry, and for transgender people to use public restrooms—are being encouraged to think only violence can be used to restore their dreamed-of place at the top of a society that used to deny such rights. That is really what’s at the core of this dispiriting mess. There are whole areas of the country, primarily in the South and the Midwest, populated by people who feel that their country is being stolen from them. Stolen primarily by people who should be at the bottom of the social ladder but whom government has given a helping and unfair hand to ascend. And their ascension seems to threaten the position of the mostly white populations who had long taken their social primacy for granted. The government that helps these ‘outsiders,’ these darker races, is therefore cast as the enemy. Its rules and regulations, its aid to ‘slackers’ and ‘illegals,’ its promotion of equal rights to those who by ‘inherited right’ should have no rights at all—all of these inflame a corrosive bile in the hearts of those who see themselves vanishing. And so they flock to the candidate whose posture seems to represent them, the candidate who uses the violent rhetoric that they themselves use in private to curse those who are ‘stealing’ what they consider rightly theirs: their place, their government, their country, their birthright as the ‘right’ kind and color of people.
            Where this will end is not clear yet. If the demographics are allowed to take their natural course, the people of color will in fact shortly outnumber the malcontents, and there will be no need for contests or debates. But it is not at all certain that demographics will be allowed to proceed unhindered, or without some blood being shed. And that is what is dispiriting in the end. As positions continue to harden, as the rhetoric gets more and more inflammatory, it becomes difficult to see how saner heads, as most often in the past, can prevail. And though to this observer, it is impossible to see how whole populations can be persuaded by the simple-minded lies of the demagogues, clearly large numbers can. The only question becomes, how far will the crazies go? How angry and belittled are these interior masses? How willing to water their personal tree of ‘liberty’ with blood?
            I have to tell you that at this moment, it doesn’t feel hopeful—not least because the only viable alternative is the other ‘hold-your-nose’ candidate.

Lawrence DiStasi

Monday, August 29, 2016

White Trash

Before readers get their drawers in an uproar, I should explain that my title merely repeats the title of a new book by historian Nancy Isenberg: White Trash: The 400-Year Untold History of Class in America (Viking: 2016). And a very fine book it is. I have to confess to an ingrained prejudice against the title group most typified by the Trump supporters who seem drawn in large part from this class. But reading Isenberg’s book opened my eyes to the long history of contempt, exploitation and prejudice faced by what may be a dying breed, and made me rather more sympathetic than I would otherwise be. To begin with, Isenberg takes great pains to show that the contempt for both America as a colony and the vagrants and beggars who would largely populate it had a long history in England. Starting from the earliest colonial forays, the English saw the new continent as “an outlet for the unwanted, a way to remove vagrants and beggars, to be rid of London’s eyesore population” (10). More than that, they saw America as not just a source of fertility and income, but even more as a “place of outstanding wastes, ‘ranke’ and weedy backwaters, dank and sorry swamps” (10), literally a “waste-land” in itself and a convenient repository for “waste” people. Ship captains actually rounded up children from the streets of London to sell to planters across the ocean—it was known as “spiriting.” Those and others like them sold into indentured servitude were virtual slaves, unable to move or marry, subject to whippings, and able to be sold (the master could sell the indentured contract to another master).
            Accordingly, early tracts on the new land, like a famous one from Richard Hakluyt, saw America as “one giant workhouse,” a place where the waste people of England “could be converted to economic assets. The land and the poor could be harvested together…” (21). The language used could be even more graphic: to cure the “plague” of poverty in England, the colonies were called “emunctories, excreting human waste from the body politic” (an emunctory is a human organ like a kidney that serves to carry off wastes). And what Isenberg is at great pains to demonstrate is simple: this English class system that saw the poor as societal “waste” to be excreted was replicated in the new world and the new nation. In Virginia, for example, a planter elite quickly took over the best lands, grew rich on tobacco, and exploited not just slaves but their indentured servants. A 1662 Virginia law stipulated that children remained servants until age twenty-four; servants were classified as “chattel, as movable goods and property,” essentially equivalent to livestock. The same social system pertained among the Puritans in Boston. Far from being the cradle of democracy (Governor Winthrop labeled democracy “the meanest and worst of all forms of government”), Puritan Boston was dominated by class divisions where children were essentially the servants of their father, and where the “first slave cargo arrived in 1638” (31). The big divider in Boston and elsewhere was land. The landless were without station, without election, without power of any kind; their only recourse was to escape (usually to the frontier). The same held in Virginia. “The most promising land was never equally available to all,” because the “royal surveyors made sure that large planters had first bids on new, undeveloped land, and so the larger tracts were increasingly concentrated in fewer hands” (37).
            This was dramatically clarified in Bacon’s Rebellion, which took place in 1676 in Virginia. Vagrants, frontiersmen, indentured servants and slaves all joined Bacon in rebellion, and were known to the powerful as “offscourings.” The word means “human fecal waste,” and it indicates both the contempt of the landowning elite for the marginal landless, and their fear that a united underclass might lead to bigger and far more lethal rebellions. One result was a hardening of racial lines and more formalized class divisions in the South. Isenberg tells us about the Fundamental Constitutions of Carolina (1669), written by none other than that great hero of the founding fathers, John Locke. According to Isenberg, Locke’s tract not only endorsed slavery, it promoted “a semi-feudalistic and wholly aristocratic society” with noble titles taken from Germany (‘landgraves’), and categories of inherited servitude (“leet-men” were like serfs, tied to the land and their lord). Through it all—through the splitting of Carolina into North and South, with the South dominated by slavery; and the attempt by James Oglethorpe to create a slave-less colony in Georgia where poor whites could prosper with hard work—the contempt for and exploitation of poor whites continued. They were considered a new breed of human: lazy, slothful, interested only in procreating their miserable kind. William Byrd described Carolina as “lubberland,” a swamp of inferior, ungovernable people who had no desire or ability to make the land productive or profitable, and therefore literally “waste-land.”
            In this situation, the main solution for poor, landless whites was escape to the “unoccupied” lands of the west. Indeed, the populating of the lands from the Allegheny Mountains to the Mississippi River was done mainly by the white squatters who simply squatted on land, built a shack on it, and survived as best they could. It was, Isenberg reminds us, “a recapitulation of the English tactic of getting rid of its ‘waste’ by shipping it west to the colonies” (105)—in this case, the territories west towards the Mississippi. And the attitude toward this “waste people” was the same on the part of the landed elites in the colonies as it had been by the British earlier: contempt. These frontier squatters did eventually get their heroes and representatives. Andrew Jackson was probably the first of note, dubbed the backwoods president who represented ‘cracker country.’ Davy Crockett was another—a Congressional hero in a coonskin cap who epitomized the frontier virtues of fighting spirit, contempt for Indians, and a love of boasting. Isenberg quotes James Agee about this latter trait, thought to derive from the shame ‘crackers’ were covering: “The poor…have merely internalized a kind of ‘anesthesia,’ which numbs them against the ‘shame and insult of discomforts, insecurities, and inferiorities’” (228).
            Contempt or not, the elites were always ready to use the poor to fight in wars. Just as they formed the main battalions in the Revolutionary War, they were also the mainstay of the Confederate forces. Plantation owners were able to do this by redirecting “the hostility of the South’s own underclass, the nonslaveholding poor whites” (159) towards both the black slaves and the northerners trying to free them. Even though they themselves had been called the “degenerate race” before the war, now poor whites were told that Yankees were actually the degenerates, with an agenda not just of abolishing slavery, but “inciting class revolution in the South” (158). Such propaganda was necessary because it was common knowledge that this was “a rich man’s war and a poor man’s fight.” The rich could hire substitutes to fight in their place, while the poor were subject to a conscription that targeted all males from seventeen to fifty. As a result, there was strong resistance to the draft and to the war itself, with thousands deserting a Confederacy many felt little attachment to. Perhaps they knew what they were doing after all: in the reconstruction overseen by Pres. Andrew Johnson (himself a southerner), more whites than blacks actually got federal relief, with hundreds of thousands living off “Uncle Sam’s rations” (178).
            This may help us to understand how the current politics of “redirecting white hostility” continues to work. That is, despite the fact that in recent years white trash has become almost chic, a virtual ‘ethnic identity’—with Elvis Presley a culture hero; with Andy Griffith a TV hero; with NASCAR an entertainment phenomenon that tops the charts; and with Lyndon Johnson and Bill Clinton winning landslide victories for the presidency—the Republican party, since at least Richard Nixon’s southern strategy, still depends on the white population in the solid south to win its elections. And it does this in the same way Confederate leaders did: it redirects white hostility towards northerners, “eggheads,” government regulators, and urban blacks. Lyndon Johnson, in one of his remarks to Bill Moyers, probably put it best:

 “If you can convince the lowest white man he’s better than the best colored man, he won’t notice you’re picking his pocket. Hell, give him somebody to look down on, and he’ll empty his pockets for you” (315).

In other words, poor whites, the ‘waste’ that the American Dream never seemed to reach, have always been exploited, one way or the other. But rather than directing their hatred at those who exploit them (land speculators or bankers or politicians) or the class system that keeps them at the bottom, the dispossessed hate those they can look down on (even if he’s President of the United States). And supplying someone to look down on seems to be part and parcel of the political fabric of America. This is evident in the campaign of Donald Trump in 2016; his most loyal supporters are those whose hatred he has been able to redirect, convincing them that he, a billionaire, is their ally in bitterness, in animosity towards the ‘government.’ In this sense—and we have no idea how successful Trump will eventually be in mining this traditional reservoir of Republican rancor—the Trump phenomenon validates Nancy Isenberg’s point in her book. Far from being some marginal aspect of the national story, the dispossessed have always been key to “our very identity as a nation” (320). They always have been, and still are, a fundamental part of our history—a history that tries to perpetuate the equal-opportunity-for-all myth, but which, in reality, becomes more and more dominated by inherited wealth and, yes, deep class divisions. As she puts it in her epilogue: though the labels have varied, from trash to wastrel to vagrant to squatter to cracker to trailer trash, “White trash is a central, if disturbing, thread in our national narrative” (321). It is the story of those who fail to rise in America and a far more important part of “who we are as a civilization” than most of us, especially our mainstream historians and politicians, have ever been willing to admit.

Lawrence DiStasi

Friday, August 5, 2016

Our Malevolent Doppelganger

When I first heard of Siddhartha Mukerjee’s prize-winning book, The Emperor of All Maladies, I decided I wanted no part of it. Why read a book about cancer when we already hear far too much about this insidious horror of a disease? But then I read Mukerjee’s second book, The Gene, and realized that he is the latest in that enduring tradition of physicians (William Carlos Williams, Lewis Thomas, Abraham Verghese) who are also brilliantly accomplished writers. So I decided to take a look at The Emperor of All Maladies (Scribner: 2010). I’m truly glad I did. Mukerjee is the real thing—a physician who seems to have maintained his humanity, his soul, his sensitivity to words, even as he maintains a killing schedule as an oncologist. Both aspects of his persona give him the necessary insights to write the book he has: a ‘biography of cancer’ that crackles with the suspense of a good mystery. The mystery, of course, involves finding the biology and genesis of cancer as a disease—the scientific pursuit—and ferreting out the medications that might promise a cure—the therapeutic pursuit. Both, though with a heavy emphasis on finding the cure, comprised what came to be known as The War on Cancer. It was a public-relations ploy designed to raise money to pay for research: money for trials of new procedures such as drugs for chemotherapy, surgery for tumor removal, X-rays to kill the offending cells; and also for the pure research into the fundamental biology of cancer cells to finally discover what, in fact, was causing cancer. What was the cellular malady that turned normally functioning cells into maniac proliferators of the tumors that were choking the body—always more and more bodies, it seemed—to death? And though the War on Cancer succeeded in raising an astonishing amount of money for cancer research and therapy, both from the U.S. Government and from private foundations, it never quite lived up to its promise.
This is because the metaphor of war automatically implies taking aim and destroying an outside enemy—a virus or a bacterium that invades the body. But what Mukerjee leads us to in the end is the discovery, made gradually over the years, that the enemy is not something external to the human body. The enemy is within. Within the body. Deep within the cell. As Pogo once famously said, “we have met the enemy and he is us.”
            This is really the main thrust of Mukerjee’s book for me: cancer is not something that invades the body from without; cancer is a relentless and sometimes beautiful (Mukerjee actually uses this word) perversion of the most basic process of the human body: cell division or mitosis. The body must reproduce its cells constantly in order to live, to survive (blood cells are produced in our bone marrow at the astonishing rate of 300 billion per day!). And cancer hijacks this process in a way that makes it a virtual duplicate of ourselves. Here is how Mukerjee puts it early on:

To confront cancer is to encounter a parallel species, one perhaps more adapted to survival than even we are….This image—of cancer as our desperate, malevolent, contemporary doppelganger—is so haunting because it is at least partly true. A cancer cell is an astonishing perversion of the normal cell…Like the normal cell, the cancer cell relies on growth in the most basic, elemental sense: the division of one cell to form two (38).

What is even more mind-boggling is that cancer is not simply a fierce replica of our own ability to produce cells; the resultant cells also have the ability to evolve, to change in response to our attempts to kill or halt them. Mukerjee again:

Every generation of cancer cells creates a small number of cells that is genetically different from its parents. When a chemotherapeutic drug or the immune system attacks cancer, mutant clones that can resist the attack grow out. The fittest cancer cell survives (39).

So though we might want to think of cancer as simply a “dumb” result of basic chemical processes, we are forced to realize that cancer, like all “dumb” life, possesses a deep and deeply-ingrained intelligence. It recognizes attempts to extirpate it, bides its time, and works out strategies that allow it to survive and thrive (though here, as elsewhere in contemplating disease, I have never been able to quite figure out how “survival” fits a disease whose end game seems to be to destroy its host, and thereby, itself). Leukemia cells under attack from poisonous chemicals (combination chemotherapy), for example, seem to know enough to be able to migrate (metastasize) to the brain, where these chemicals are helpless to cross the blood-brain barrier. Mukerjee calls the brain, in this instance, “a natural ‘sanctuary’ for cancer within the body,” for a leukemia that seems almost conscious: “sensing an opportunity in that sanctuary, [it] had furtively climbed in, colonizing the one place that is fundamentally unreachable by chemotherapy” (147).
            Mukerjee gives us a detailed history of how cancer came to be recognized as a specific disease (as far back as ancient Egypt), and the many therapies developed to combat it: surgery (his description of mastectomies to extirpate breast cancer leaves us fascinated, and horrified at the more and more radical excisions that surgeons like William S. Halsted recommended in their mania to cut out every bit of a remaining cancer—all this mutilation, in the end, to no avail); chemotherapy, whose drugs were found almost by chance, by trial and error, and which evolved to include higher and higher doses of more and more drugs, often leaving the patient half-dead from nausea (combination drug, X-ray and spinal-tap therapy was called, at St. Jude’s, “total hell”); and radiation therapy, with higher and higher doses of X-rays, which themselves led to mutations; all in the effort to make the War on Cancer pay off with what was hopefully referred to as a “moon shot.” Mukerjee describes each of these phases in detail, often animated with case histories of some of his patients—the most memorable being Carla Reed. In her quest to stop her leukemia, we are told, Carla in 2004 entered “total hell,” visiting the clinic 66 times, with 58 blood tests, seven spinal taps, and several bone-marrow biopsies, in addition to multiple chemotherapies and radiations. Mukerjee cites a writer, a former nurse, describing a typical course of this “total therapy” at St. Jude’s hospital:

“From the time of his diagnosis, Eric’s illness had lasted 628 days. He had spent one quarter of these days either in a hospital bed or visiting the doctors. He had received more than 800 blood tests, numerous spinal and bone marrow taps, 30 X-rays, 120 biochemical tests, and more than 200 transfusions. No fewer than twenty doctors—hematologists, pulmonologists, neurologists, surgeons, specialists and so on—were involved in his treatment, not including the psychologist and a dozen nurses” (169).

But at least some of the patients suffering these agonies earned extensions of their lives. Mukerjee is harder on the results of the radical surgeries that were, and in rare cases still are, the preferred treatment for breast cancer:

Between 1891 and 1981, in the nearly 100 years of the radical mastectomy, an estimated 500,000 women underwent the procedure to “extirpate” cancer….Many were permanently disfigured; many perceived the surgery as a benediction…When radical surgery fell, an entire culture of surgery thus collapsed with it. The radical mastectomy is rarely, if ever, performed by surgeons today (201).

            The good news (if one can call it that) in Mukerjee’s story has to do with the long process of discovery about cancer biology, and the linking, finally, of these discoveries with therapies and drugs designed to match that knowledge. First, the discoveries (and it should be noted that all I can do here is provide a truncated sketch of what was and is a very complicated process). Beginning with a hunch by a German scientist named Theodor Boveri, biologists began to home in on the mechanisms whereby the tightly regulated process of mitosis (cell division) in normal cells became chaotic in cancer cells. Bruce Ames, with his famous test on Salmonella bacteria in the late 1960s, found that a gene mutation would allow Salmonella to grow on sugar (galactose). He then saw that chemicals that scored high as mutagens (causing mutations) also tended to be carcinogens (causing cancer). Carcinogens, in short, had a common property: they could alter genes (mutation). One of the first carcinogens to be identified in the lab was a virus: the Rous Sarcoma Virus that could insert a viral gene into cells and make them cancerous. Though many scientists then became convinced that all cancer was caused by viruses, Howard Temin soon saw that it wasn’t the virus but what it had done that was key. By examining the Sarcoma virus, several scientists next found a specific gene, a single gene, that had done the damage. The gene was called src (pron. “sarc”), and it became one of a class called “oncogenes”—genes capable of causing cancer. It was then found how the src gene functioned: it encoded a protein whose main function was to modify other proteins by attaching a chemical, a phosphate group, to these proteins.
Such protein enzymes were already known as kinases, and they acted as “molecular master switches,” switching one protein “on,” which in turn switched another “on,” until the cascade flipped the target cell from a non-dividing to a dividing state, all under tight control. Src, by contrast, was a kinase on hyperdrive, turning normal cells into endlessly dividing machines, the hallmark of cancer.
            One final mystery remained: how did src evolve into an oncogene? Two scientists at UCSF (University of California, San Francisco), J. Michael Bishop and Harold Varmus, began in the 1970s to study src and came up with the solution. They discovered that src was not some foreign gene that had infiltrated normal cells; src was everywhere in normal cells from ducks to mice to fish to humans. But these normal src genes were not identical to the ones in the Rous virus. They were kinases, but not hyperactive ones; they were tightly regulated to act only during normal cell division. In short, they did not have the mutation that the viral genes had that made them permanently activated. Out of this, Varmus and Bishop developed a theory: normal src was a precursor to the cancer-causing viral src. It was a normal part of the cell, endogenous to the cell, that needed a mutation to turn it into a cancer-causer, an oncogene. Here is how Mukerjee sums up this vital discovery and insight:

The crucial implication of the Varmus and Bishop experiments was that a precursor of a cancer-causing gene—the “proto-oncogene,” as Bishop and Varmus called it—was a normal cellular gene. Mutations induced by chemicals or X-rays caused cancer not by “inserting” foreign genes into cells, but by activating such endogenous proto-oncogenes (362).

Mukerjee goes on to put this in historical perspective: “The Greeks had been prescient with their name for cancer, onkos (meaning load or burden). Cancer was intrinsically ‘loaded’ in our genome, awaiting activation” (362)—often by an environmental insult like cigarette smoke or radiation. He also cites a wonderful image from Harold Varmus’ speech when he and Bishop received the Nobel Prize in 1989, revealing the cancer cell to be “like Grendel (the monster in Beowulf), a distorted version of our normal selves” (363). 
            With many modifications and extensions (further research found that there were two “flavors” of cancer genes: positive ones like src that drive cell growth into hyperactivity [Bishop compared these to a “jammed accelerator”]; and negative genes, like Rb, that normally suppress cell division, but, with mutations, lose their suppressing function so that cell division goes on unhindered [as in “brakes” that don’t work]), this has become the dominant theory in cancer research. It has also, finally, led to targeted therapies that have allowed drugs to be specifically targeted to a specific kind of cancer-causing gene. Among these new targeted drugs was one called Herceptin, which targeted a breast-cancer oncogene labeled Her-2. In 1991, a patient named Bradfield was given Herceptin in combination with an older chemical, cisplatin, designed to kill breast cancer cells. Two months into her therapy, Bradfield’s neck tumor disappeared, and after 18 months of therapy, she was in full remission and survives today. Another is known as Gleevec, a drug developed by Ciba-Geigy (now Novartis) for Chronic Myeloid Leukemia or CML. Though Novartis at first refused to spend money on drug trials for Gleevec (too few patients would use it for the company to make money), it finally relented and agreed to a few trials. As of 2009, CML patients treated with Gleevec were surviving an average of thirty years after diagnosis, proving that targeted cancer therapy really does work.
            But lest we forget, Mukerjee reminds us that cancer is the wiliest of all diseases. Soon, doctors were noticing that some cancers were demonstrating Gleevec resistance (much as bacteria become resistant to antibiotics). It is worth trying to describe this Gleevec resistance, for it demonstrates the phenomenal intelligence Mukerjee is at pains to make us see. I have already described cancers that migrate into the brain to escape drugs; but there is another: a cancer-cell mutation that, almost fiendishly, activates the cellular pumps that normally rid the cell of natural poisons—to get rid of the chemotherapy drugs! Gleevec-resistant cells did something more astonishing still: they acquired mutations that precisely altered the structure of the leukemia-causing oncogene Bcr-abl, “creating a protein still able to drive the growth of leukemia but no longer capable of binding to the drug” (442). That is, where normally Gleevec slips precisely into a “narrow, wedgelike cleft in the center of Bcr-abl” to literally pierce its heart and kill it, the mutations altered this molecular “heart” so that the drug could no longer penetrate it (it no longer fit), thus making the mutated cancer immune. As Mukerjee puts it, “To escape targeted therapy, cancer had changed the target” (442).
            I don’t know about you, but this kind of (apparently) non-cognitive intelligence, even in a form that most of us would not hesitate to call “evil,” leaves me gasping for words. It does the same to Mukerjee, though he is quite adept at providing brilliant phrases and sentences in this captivating book. But let me give you some of the thoughts that it has evoked in me, and then end with Mukerjee again. It occurred to me today that if cancer is a perversion of our normal selves, our normal processes, as Mukerjee says, then we might say that it is a perversion because it is uniquely and brilliantly concerned only with its own survival. I have always had a problem with those who insist that the only thing that matters in life is survival. Because if survival is the end game, then cancer does it even better than we do. Cancer, as we see countless times in Mukerjee’s story, is the ultimate survivor. He even says, at one point, “Some day, if a cancer succeeds (in finding immortality), it will produce a far more perfect being than its host—imbued with both immortality and the drive to proliferate” (459). Cancer, that is, is better than we humans are at survival. The question becomes, is that all we are? Are we simply here to survive? Designed to outlast everything and everyone else? If that is the case, then we are on the “right” path, moving ourselves and the planet towards destruction as we do so. In this way, we are indeed just like cancer. Our monomaniacal drive for survival moves inevitably towards the destruction of our host, the only home we know. Which is why it is here that I part company with the survivalists. Though it is difficult to say how, and to what degree, humans, human being, is/are more than simply the number of years we survive or the sum of those who survive. Human being involves others, involves all other being. It matters to humans if others survive. 
It must matter, as we know from the cases where such mattering is thrust aside—and we see the horrors that have marked our century, and the horrors that may yet be coming if we do nothing but look out for our own survival, either as individuals, as a country, as a continent, as a hemisphere. We can easily predict the outcome of that kind of survivalism; we are already getting a taste of it today. No. Human being, again, is more than mere survival, and that is how we differ from cancer.
            So though Mukerjee ends with a reiteration of his overall theme (“Cancer is a flaw in our growth, but this flaw is deeply entrenched in ourselves. We can rid ourselves of cancer, then, only as much as we can rid ourselves of the processes in our physiology that depend on growth—aging, regeneration, healing, reproduction”), when he says this he is speaking strictly as a scientist, as someone looking at cancer, at humans, as strictly physical processes. Though I have nothing but admiration for his ability to do this, and for his ability to make us see how elegant and intertwined this dread disease is with our own fate, I part company with him here. For here, after all, is where we as humans have the capacity to look deeply into this flaw in our being, contemplate it, even come to terms with it, perhaps accept it (at least in the abstract)—and in so doing, comprehend it. We, that is, comprehend it; it does not comprehend us, other than as obstacles to its survival. And that makes all the difference.

Lawrence DiStasi

Thursday, July 28, 2016

Death of a Comrade

I’m not sure what I want to write about it, about the death yesterday of my old friend, Gian; except that I feel the need to write something. It brings to mind the opening piece I wrote for Una Storia Segreta (Heyday: 2001), which I titled after a line in Prospero Cecconi’s notebook, “Morto il camerata Protto.” Cecconi was referring to the death of his friend Giuseppe Protto, when both were domestic internees (yes, so-called ‘enemy aliens’ of Italian descent were interned during that war after being judged “potentially dangerous”) imprisoned at Camp Forrest, TN during World War II. Cecconi, too, felt the need to write something in his clandestine notebook, though all he could bring himself to write was that simple line, Morto il camerata Protto; ‘my comrade Protto is dead.’ It was enough; years later, we can feel the pain, the loss, the loneliness in that lone line.
            Now, my friend Gian is dead. È morto lui. And though I have more time and more skill with language to write volumes about it, there really is very little to add. He’s dead. My friend Gian, whom I’ve known since the mid 1970s, and with whom I’ve laughed and joked and written and studied and cooked and celebrated our common heritage—we organized a little group we called the circolo in the early 1990s; and would gather once a month to cook together (they were sumptuous feasts) and laugh together and reminisce about our Italian parents and childhoods and the foods we used to eat—my friend Gianni is gone.
            At first I took it rather philosophically. Yes, I knew he was ill and in hospital where I got to see him a week or two ago. Yes, I knew he had been placed on nothing but palliative care and was certain to slip into oblivion sooner rather than later. Yes, I had been expecting the call for days. And yes, I have more or less accepted the fact of death, the fact that we all die, that nothing is more certain than the death which is a necessity of our existence and often a blessing. But when it came, something internal shifted. I didn’t even notice it at first. I busied myself with finding some mementos I could contribute to an expected memorial service, some of his drawings, some writings about him and his vintage kitchen and vintage 1940s décor and vintage humor that kept me busy most of the afternoon yesterday. But in the night I began to realize that I was grieving, albeit not in the way we think of as grieving: no tears, no depression to speak of, no laments about the futility of life or the too-early death of this life, or how I would miss him. No. There was mainly this sense of drift. I suddenly felt unmoored. It was as if an anchor in my very life had come loose—but not a literal anchor; some inner anchor that was more like a void or an eraser that had left me, or part of me, vacant. Adrift. Easily blown away. This happens more, perhaps, when one is older and the friends and relatives that remain get fewer and farther between. I don’t know. All I know is that Gianni was my close friend, someone I could always count on to be in my imagined gallery of people to speak to or places like his unique house and kitchen to go to, to sit and drink a companionable glass of wine and complain or joke or laugh or cogitate about the follies of the world with. 
For instance, there was the time I had been on my zen walk and was heading home through Berkeley, and simply dropped in for a quick rest and a cup of coffee and he immediately saw something different in me, some spiritual light in me that no one else would have seen much less valued.
            And now that real space, that imagined space is gone. Empty. The weight of it, that’s what strikes me most. The weight it provided in my life, the ballast that kept things secure and at least partly known—which is what a friend is, what a relative is, a known weight or quantity to keep one firmly in place—was gone; is gone. He is no more. Though I can still conjure up his looks and his speech patterns and his laughter, the man himself is no more. The weight of him. The solidity of him. The actual belly and blood of him.
            And how peculiar it really is, this sense of another. We can imagine it sometimes. We can still find the outlines in our image drawer in the mind. But we know, after death, that it is only an image. It has no flesh to it. No bones to it. No scent or feel to it. No response to it because it can’t talk back. It can’t provide the answering weight of itself to our own presence because it has no presence anymore. And presence—what a person actually is in the flesh—though it’s almost impossible to express, is something we know. Know without any reflection or reason that that’s what a person is, really. That presence. And it is not captured in drawings or photos or films or any medium but itself. Part of it can be captured, the part that is analogous to what we can conjure on our mental screens. But the real fullness of it, the living presence of another human or animal or tree or flower—that is never, cannot be ever captured in any of our media. We are fooled that it is. We are fooled into thinking that we really know those we see on TV or on our computer screens or in our smartphones. We don’t. All we know is shadows, poor bereft shadows that have no weight, no depth, no life. Which is what we’re left with when someone dies. Shadows. I still have the shadow of Gianni. But his presence, his weight, his laugh, his life—that is gone forever. And in some terrible way, it makes me lighter, more fleeting, more adrift.
            That is what we grieve. We grieve, I grieve the loss of that unique, indispensable, never-to-be-repeated presence that was Gian Banchero. That will never come again. So simple: È morto lui. So commonplace: È morto lui. And yet so deeply, unfathomably vacant, empty, weightless, gone.

Lawrence DiStasi

Tuesday, July 26, 2016

The Gene and its Discontents

Notwithstanding its beautifully rendered history of how scientists finally, after 2,500 years of speculation, discovered and named the “gene” as the mechanism of heredity (Darwin had no idea of this mechanism, speculating about tiny things he called “gemmules”), for me the most fascinating parts of The Gene, by Siddhartha Mukerjee (Scribner: 2016), are the materials on eugenics. Originated by Francis Galton (Darwin’s cousin) in the late 19th century, “eugenics” refers to the idea that humans should select the “best” genes from among human populations, advancing those “good” genes and eliminating the “bad” ones to produce a race of “perfect” humans. In a lecture at the London School of Economics in 1904, Galton proposed that eugenics “had to be introduced to the national consciousness like a new religion.” Arguing that it was always better to be healthy than sick, and ‘good’ rather than ‘bad’ specimens of their kind, he proposed that mankind should be engaged in selectively breeding the best, the good, the strong. As Mukerjee quotes him: “If unsuitable marriages from the eugenic point of view were banned socially…very few would be made” (p. 73), the mechanism to promote ‘suitable marriages’ being a kind of golden studbook from which the “best” men and women could be chosen to breed their optimal offspring. No less a figure than H.G. Wells agreed with Galton, as did many others in England who even then were expressing fear about the inferior working classes out-breeding the better classes. Galton founded the Eugenics Review in 1909 to further advance his ideas, but died in 1911, before he could really get eugenics going in England. Other countries, however, like Germany and the United States, were already taking steps to follow Galton’s lead. Indeed, at the first International Conference on Eugenics, held in London in 1912, one of the main presenters was an American named Bleecker Van Wagenen.
Van Wagenen spoke enthusiastically about efforts already underway in the United States to eliminate “defective strains” (of humans) in America, one of which involved confinement centers—called “colonies”—for the genetically unfit. These colonies were the target of committees formed to consider the sterilization of ‘unfit’ humans such as epileptics, criminals, deaf-mutes, and those with various ‘defects’ of the eyes, bones, and mind (schizophrenics, manic-depressives, the generally insane). As Van Wagenen suggested,

Nearly ten percent of the total population…are of inferior blood, and they are totally unfitted to become the parents of useful citizens…In eight of the states of the Union, there are laws, authorizing or requiring sterilization (77).

            Van Wagenen was not kidding. The United States continued its misreading of Darwin and its enthusiasm for sterilizing the ‘unfit’ well into the 20th century, and it was not just the lunatic fringe that was involved. Mukerjee cites a famous case that came before the Supreme Court in 1927, Buck v. Bell. This case concerned one Carrie Buck, a Charlottesville, Virginia woman whose mother, Emma Buck, had been placed in the Virginia State Colony for Epileptics and the Feebleminded after she was accused of immorality, prostitution, and having syphilis. In fact, Emma Buck was simply a poor white woman with three children who had been abandoned by her husband. No matter; she was judged ‘unfit,’ and, with her mother confined, little Carrie was placed in a foster home, was removed from school by her foster parents to work, and at age 17 became pregnant. Her foster parents, John and Alice Dobbs, then had her committed to the same State Colony for the Feebleminded on the grounds of feeblemindedness and promiscuity, and there, in March 1924, Carrie gave birth to a daughter, Vivian. But having been declared mentally incompetent, Carrie was unable to stop the Dobbses from adopting her baby. (One reason the Dobbses may have wanted the baby was that, as it later turned out, Carrie’s pregnancy was the result of a rape by the Dobbses’ nephew.) Carrie was quickly scheduled to be sterilized, and the Supreme Court case of Buck v. Bell was brought to test the sterilization law—the 1924 Virginia Sterilization Act—to which Carrie Buck was subject, being already in a state institution for the feebleminded. Astonishingly, with the ‘great’ Oliver Wendell Holmes writing the majority opinion, the Supreme Court voted 8 to 1 that the Sterilization Act did not violate the U.S. Constitution’s due process provisions—since Carrie Buck had been given a hearing, and since she was already confined to a state institution. Mukerjee cites some of Holmes’s now-infamous ruling:

It is better for all the world, if instead of waiting to execute degenerate offspring for crime, or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind. The principle that sustains compulsory vaccination is broad enough to cover cutting the Fallopian tubes…Three generations of imbeciles is enough (83-4).

In accordance with the Supreme Court’s ruling, on October 19, 1927, Carrie Buck was sterilized by tubal ligation. The fact that her daughter Vivian—the ‘third-generation imbecile’ Holmes referred to—had performed adequately in the school she attended, being of decidedly average intelligence, did not save Carrie; nor, for that matter, did it save Carrie’s sister Doris, who was also sterilized, without her knowledge, when she had her appendix removed. After this, sterilization was free to spread in the United States; in 1927, for instance, the great state of Indiana revised an earlier sterilization law to cover “confirmed criminals, idiots, imbeciles and rapists,” with other states following suit. Pre-marital genetic fitness tests became widespread, as did Better Babies contests at state fairs. With the help of practical ‘genetics,’ America was out to produce a race of perfect humans fitted to its already ‘perfect’ political system and ‘perfect’ society.
            The logical next step in the eugenics movement came, of course, in Nazi Germany. In 1933, Mukerjee tells us, the Nazis enacted the Law for the Prevention of Genetically Diseased Offspring, aka the Sterilization Law. Its premises were borrowed directly from America’s own program: “Anyone suffering from a hereditary disease can be sterilized by a surgical operation,” the diseases to include mental deficiency, schizophrenia, epilepsy, depression, blindness, deafness, and other serious deformities (121). Any cases in dispute were referred to a Eugenics Court, whose rulings allowed for no appeal. With films like Das Erbe (The Inheritance, 1935) propagandizing in its favor, the law became a grim model of efficiency, with 5,000 adults being sterilized each month by 1934. And as with its other better-known programs, the Nazis moved smoothly and efficiently to the next step—euthanasia. A Scientific Registry of Serious Hereditary and Congenital Illnesses was set up, devoted to euthanizing (i.e., killing) ‘defectives’ permanently, to ‘purify’ the gene pool. The Nazis coined a grim euphemism to justify all this, perverting Socrates’ famous dictum about “the unexamined life not being worth living” into its macabre opposite: the euthanized were characterized as having lebensunwertes Leben, ‘lives unworthy of living.’ Though at first the targets were limited to children under three, soon the net was extended to adolescents, then juvenile delinquents, and finally, in October 1939, to adults, with Jews at first conveniently labeled “genetically sick.” Typically, the Nazis set aside a villa, No. 4 Tiergartenstrasse in Berlin, as the official headquarters of their euthanasia program, which eventually became known as Aktion T4, for its street address. Mukerjee at this point gives us one of his trademark elegant sentences:

But it is impossible to separate this apprenticeship in savagery from its fully mature incarnation; it was in this kindergarten of eugenic barbarism that the Nazis learned the alphabets of their trade….The dehumanization of the mentally ill and physically disabled (“they cannot think or act like us”) was a warm-up act to the dehumanization of Jews (“they do not think or act like us”) (125; my emphasis).

            There is other fascinating material in this altogether fascinating book, but I will leave most of that to other readers to discover. What I should like to stress is what Mukerjee himself stresses about genes, the genetic code, and eugenics. First, genes, contrary to common perception, are not blueprints that specify every element of an organism. Rather, they are like recipes: just as a recipe provides instructions about the process of cooking something, genes provide instructions about the process of building an organism. And as with a recipe, all sorts of chance or even intentional events can produce all sorts of variants. The chance event par excellence, of course, is the mutation. The problem is that humans, and especially humans who get seduced by the prospect of either eliminating “bad” mutations or selecting for the “best” ones, misinterpret what mutations are and how they function in evolution. Citing the realization of Dr. Victor McKusick, Mukerjee makes the critical distinction that he wants everyone to grasp—that a mutation is a “statistical entity, not a pathological or moral one.” A mutation doesn’t imply something bad, like disease, nor even a gain or loss of function:

In a formal sense, a mutation is defined only by its deviation from the norm (the opposite of “mutant” is not “normal” but “wild type”—i.e. the type or variant found more commonly in the wild). A mutation is thus a statistical, rather than normative, concept. A tall man parachuted into a nation of dwarfs is a mutant, as is a blond child born in a country of brunettes—and both are “mutants” in precisely the same sense that a boy with Marfan syndrome is a mutant among non-Marfan, i.e., “normal,” children (264).

This distinction is critical, especially as regards the benighted attempts to create perfect humans or a race of normal humans. What we call “normal” is merely that which seems to be fitted to a given time, place, and set of conditions. To try to select for this “normalcy” is to completely misunderstand what genetics and evolution tell us. The “fittest” are not those who have won some sort of evolutionary or genetic race that holds good for all time. They are simply those who happen to be well-adapted to a given set of environmental and social circumstances. The worst conclusion one could draw from such “fitness” would be a) to decide to select only for those adaptations and exclude all others; or b) to try to interfere in genomes and eliminate all genetic variants in the vain hope that humans could be bred free of all illness or ‘unfitness.’ Conditions inevitably change. We have no idea what conditions might eventuate that might require some of the variants we would like to prune out of existence—and prune is the accurate word here, leading us, as it does, to our modern mania for favoring certain varieties of, say, apples or corn or wheat, while discarding the thousands of varieties that have evolved over centuries. This is a kind of ‘vegetable eugenics’ that many botanists have warned could leave the world without staple crops in the event of a pathogen that wipes out the now-dominant varieties. In short, a diverse gene pool is an absolute necessity for evolution to proceed.
            Yet despite the disfavor that eugenics has encountered in our time, the kind of thinking that fosters it is far from dead. Mukerjee cites a case from 1969, in which a woman named Hetty Park gave birth to a daughter with polycystic kidney disease, leading to the child’s rapid death. Park’s obstetrician thereupon assured her that the disease was not genetic, and that there was no reason she should not have another healthy child. Park conceived again, but sadly the same result ensued; whereupon Park sued her obstetrician for bad advice, and won. The court ruled that “the right of a child to be born free of [genetic] anomalies is a fundamental right.” Mukerjee points out that “this was eugenics reincarnated.” In other words, the court had ratified an expectation that the particular genetic mutation that caused harm to the Park family violated their rights—in effect, that that mutation should not exist. In the coming world of gene manipulation, we can expect that many mutations now classed as “abnormal” will be similarly classified and excised from existence. But as Mukerjee reminds us again and again, if we can expect anything, we can expect that conditions will certainly change. What appears “normal” now may one day be considered to have had only temporary value, suited to a very specific time and place. As Mukerjee notes at the end of his book, “Normalcy is the antithesis of evolution” (481). That is, though we have come to distrust and despise “mutations” that compromise what we consider ‘normal,’ evolution absolutely requires them, requires a gene pool that is as varied and diverse as it can be. Mutations are the lifeblood of such diversity, the bank on which evolution relies to adapt to ever-new circumstances. And equally important, evolution does not proceed according to human wants or needs, or the wants or needs of any organism. Evolution proceeds according to what works, what is adaptable to a given circumstance at a given point in time.
There is no good or bad adaptation. There is no good or bad mutation. There is no “normal” much less “best” genome or genetic code. No one can ever know what might be needed. So before humans go about eliminating that which appears negative or useless in any given era, they should think twice about eliminating precisely that which might one day prove salvational. Here is how Mukerjee puts it towards the end of his book:

“Gene editing,” the stem cell biologist George Daley noted, “raises the most fundamental issues about how we are going to view our humanity in the future and whether we are going to take the dramatic step of modifying our own germ line and in a sense take control of our genetic destiny, which raises enormous perils for humanity” (479).

Siddhartha Mukerjee offers the history of eugenics as an object lesson: those ‘enormous perils’ of humans ‘modifying our own germ line’ are perils not just for humans, but for all life on this planet.

Lawrence DiStasi

Friday, July 22, 2016

TrumpSpeak

My title, as many of you will recognize, is a variant of the word “Newspeak” from George Orwell’s dystopian novel, 1984. Whether one should credit Donald Trump with coining a new form of speech may be questionable, but watching his performance last night, I was struck not so much by the laughable misrepresentation of almost all his alleged “facts” (if you want a good rundown of how each of Trump’s ‘factoids’ was grossly exaggerated, de-contextualized, or outright lied about, see the Washington Post’s fact-check of the speech), but by his speech patterns. (In case you’ve forgotten, one of the reasons fact-checking doesn’t matter much for a Trump audience has to do with their ‘stone-age brains.’ Briefly, most people employ a quick, instinctive estimate, made in milliseconds, of a politician’s looks and/or manner, and completely bypass the reasoning behind the information he delivers. This accords with the stone-age brains most of us still work with in interpersonal relations.)
            So, for now, let’s bypass the howlers Trump spouted in his overly long but factually empty speech, and attend instead to the patterns of rhetoric he used. To begin with, the man seems to be mostly driven—both in his domestic critiques and his foreign ones—by the notion of “getting a good deal.” This would figure, since his life seems to have been devoted to deal-making in the high-risk world of (mostly) Manhattan real estate. It is a world dominated by con men and hucksters who are always out to screw the naïve or the unwary. The New Yorker, therefore, must always be on his guard to make sure he’s not being screwed. This applies to all New Yorkers in all areas of life, but especially to those engaged in the dog-eat-dog world of real estate developing. Accordingly, Donald Trump’s rhetoric is full of critiques of his predecessors like Hillary and Obama and Bill Clinton for “not getting a good deal.” In his eyes, they gave away the store in the Iran nuclear deal; they gave away the store in Libya and Syria and Russia and China and especially in trade deals like NAFTA and the upcoming TPP. In short, previous political leaders succumbed to the cardinal sin in Trump’s world: they didn’t negotiate hard or cleverly enough; weren’t willing enough to play hardball; weren’t willing enough to talk tough and walk away and threaten and harangue. Now, of course, Trump has no way of knowing this; he wasn’t there; has never been engaged in any diplomatic activity or anything remotely political; and certainly is not about to consider the way that the United States has totally dominated and exploited almost every relationship it has entered in the post-World War II years. No. All he’s willing to bray about is how weak the nation has become, i.e. how it can no longer dictate the terms of every agreement due to its position as the biggest, baddest, most powerful nation on the globe. 
So he claims that he, the great real estate wheeler-dealer, will be able to make ‘better deals’—even, presumably, with those shirkers at home who want a free lunch.  
            And that brings us to the second noticeable rhetorical pattern. Trump never explains exactly how he’s going to accomplish all this. All he does is, first, exaggerate the problem—we’re besieged by criminals and loafers domestically and by terrorists from abroad, our cities are falling apart, our industry has all left for cheaper shores due to bad trade deals, cops are being murdered at the highest rate ever—and then assert that he’s the one who, with his superior deal-making ability, will fix the problem. Crime will end. Immigration will end. Terrorism will end. Globalization will end. Inner-city poverty will end. And he, Donald Trump, will end it.
            But how? These are complex, difficult problems that Republicans and Democrats alike have been promising to solve for decades. Not for Trump. The language is simple, the problems are simple, the solution is simple: Put Trump in Charge. And soon, trillions of dollars will be pouring into the nation’s coffers, taxes will be far lower, saving everyone more trillions, roads will be built, infrastructure will be modernized, onerous regulations will disappear, freeing up our energy sources (never mind the pollution or global warming) and pouring in even more trillions, and America Will Be Great Again.
            It is simple. And it is simpleminded. And the stone-age brains crowding the Republican Convention could not cheer loud enough or stomp hard enough or chant USA! USA! USA! often enough to roar their approval. Their devotion, even. Their lord and savior was saying it. He was saying it with confidence and certainty and with his jaw jutting out like some latter-day Benito Mussolini, and they were ecstatic (as Mussolini’s crowds often were). He would talk tough. He would be tough. He would just take those over-educated fancy-nancy diplomats and bureaucrats by the throat, saying ‘fuck your reasoning and diplomacy and equity,’ and force them to give him a good deal. And if they didn’t, he’d bomb the shit out of them.
            And that’s it. After the longest speech in convention history, Donald Trump managed to say virtually nothing but the same posturing, simple-minded crap he’s been spouting throughout his primary campaign. Leaving the rest of us, the ones searching for some sort of program or plan or logic to his meandering speech, to wonder: how can they swallow this infantile pap? How can they not see that this guy has no capacity for any thought that’s longer than a sentence or two? Did you notice that? He never stayed with one subject for any sustained length of time: it was all quick cuts, as in a commercial. Crime in the streets. Shooting cops. Terrorists. Hillary and Libya, Iraq, Syria, Egypt. NAFTA. China. Back to high unemployment. Obama care. It reminded me of what was revealed in Jane Mayer’s recent article in the New Yorker where she interviewed Trump’s ghostwriter Tony Schwartz (he wrote The Art of the Deal for Trump)—i.e. that Trump had no capacity whatever to focus on anything for longer than a minute or two. Trying to interview Trump, said Schwartz, was like trying to interview a chimp with ADHD (my metaphor). The man had no capacity to concentrate at all, so Schwartz ended up following Trump around, listening in on phone calls and interactions and inspections, to scare up material for the book. The other thing Schwartz noticed—after Trump threatened him with a lawsuit and demanded that he return all the royalties Schwartz had earned from the bestseller—is that Trump’s famously thin skin demands that he instantly attack anyone who criticizes him. We all saw that in this Spring’s Republican debates. What Schwartz reminds us is how frightening this quality would be in a President: 

“The fact that Trump would take time out of convention week to worry about a critic is evidence to me not only of how thin-skinned he is, but also of how misplaced his priorities are,” Schwartz wrote. He added, “It is axiomatic that when Trump feels attacked, he will strike back. That’s precisely what’s so frightening about his becoming president.” (Jane Mayer, “Donald Trump Threatens the Ghostwriter of ‘The Art of the Deal’”, New Yorker, July 20, 2016.)

            Donald Trump, in short, gave one of the most consistently alarmist acceptance speeches in American political history last night. But what we should truly be alarmed about is ever ceding the enormous responsibility and power of the American presidency to a man who is so ill-equipped—emotionally, mentally, and morally—to handle it. For if, with the help of a gang of speechwriters, he is unable or unwilling to put together a cogent argument that at least attempts to fill in some of the missing spaces of TrumpSpeak, then every American with an ounce of sense should be terrified about how those missing spaces might eventually take some reckless, cataclysmic shape.

Lawrence DiStasi

Friday, July 8, 2016

From Chilcot to ISIS

The bombshell in Britain in recent days has been the long-awaited (seven years in the making) report by Sir John Chilcot condemning Britain’s role in the 2003 invasion of Iraq. Most Britons, like most Americans, have long since concluded that the invasion was a disaster. But though the report fails to assign legal culpability (which many Britons who lost loved ones in the invasion had hoped to get), it does roast former prime minister Tony Blair pretty thoroughly. It says, in part, that his

 “judgements about the severity of the threat posed by Iraq’s weapons of mass destruction—WMD—were presented with a certainty that was not justified” and “Despite explicit warnings, the consequences of the invasion were underestimated….It is now clear that policy on Iraq was made on the basis of flawed intelligence and assessments. They were not challenged, and they should have been.”

It also explicitly condemns Blair (known in Britain as ‘Bush’s poodle’) for blindly following the lead of President Bush, citing a letter Blair wrote in July 2002 promising that “I will be with you whatever…” According to the report, this appeasement of George W. Bush constituted Blair’s only real success.
            That the report took seven years to appear is in part attributed (by a 2003 report in London’s Independent cited in Alternet’s account of the Chilcot release) to a “fierce battle” waged by the U.S. State Department and the White House as early as 2003 to block release of the report because it allegedly contained “classified information.” Whether the release of the report in 2003 would have saved lives, either British or Iraqi, is not known, but it might at least have caused some re-evaluation of the Bush administration’s rationale for the invasion, which in turn might have led to Bush’s defeat in the 2004 election. Instead, of course, we got four more years of the worst presidency in history.
            If this were the end of it, the Iraq war blunder would still count as a horror costing millions of lives, but not one as grave or extended as it subsequently turned out to be. For the current plague of ISIS attacks in Iraq, Syria, and now throughout the Middle East and the world stems directly from the hubris and secrecy of the Bush Administration during that time. This is made clear in a recent Frontline documentary (first aired May 17, and again, when I saw it, on July 5), The Secret History of ISIS. What the documentary reveals is how ISIS was able to thrive and grow through a series of blunders—mainly driven by “optics”—regarding its first leader, one Abu Musab al Zarqawi. We learn that Zarqawi was known to the CIA even before the invasion in 2003: according to Nada Bakos, a CIA analyst charged with looking into his background, Zarqawi was a tough kid who grew up in a tough neighborhood in Jordan, one who appeared on his way to a lifetime in prison as a thug, pimp, and general hardass covered with tattoos. But one stint in prison radically changed him: he became a jihadist, a holy warrior; and to demonstrate his zeal, he actually removed his tattoos by using a razor blade to cut off his outer layer of skin. After that, he left Jordan for Kandahar in Afghanistan, determined to join up with Osama bin Laden. But bin Laden ignored this wannabe from Jordan, and in 2002 Zarqawi saw a chance to strike out on his own, this time in Iraq. He set himself up near the Iran/Iraq border and began building his den of crazies. Fortunately, the CIA had an informant in Zarqawi’s camp and saw him as a definite threat in the event of an invasion, particularly as Zarqawi’s group was apparently trying to build chemical and biological weapons. CIA analyst Sam Faddis, assigned to the case, therefore formed a plan to take him out, and forwarded the attack plan to the White House for approval.
            But the White House, in the person of VP Dick Cheney and his aide Scooter Libby, wanted no part of the takeout, especially before the big invasion, so Cheney and Libby drove to the CIA to undermine the CIA’s information. From their aggressive questioning, it was clear that the White House had more in mind than simply worry about a strike that might pre-empt their war plans. They had concocted a narrative concerning Saddam Hussein’s al Qaeda connection and involvement in 9/11 as a big part of their casus belli. And when the CIA said there was no connection, it was clear that Cheney and Libby badly wanted there to be one. This would eventually lead to Colin Powell’s memorable speech at the UN, in which the Secretary of State besmirched his reputation by accepting the White House’s script—which he, uncharacteristically, read verbatim at the UN. And though the White House appeared to follow protocol by sending the script to the CIA for vetting, Nada Bakos testifies in the documentary that the White House simply ignored the CIA’s corrections and stayed with their required script. As Colin Powell authoritatively put it:

“…there’s a sinister nexus between Iraq and terrorist networks. Iraq today harbors a deadly terrorist network headed by Abu Musab al Zarqawi, an associate and collaborator of Osama bin Laden and his lieutenants.”

When confronted in the Frontline documentary about this clear fabrication in his UN speech, Colin Powell claims that his memory now is vague, but insists that his references to Zarqawi were unimportant to his general case. The truth is that a full seven minutes of the Powell speech were devoted to Zarqawi, who is mentioned no fewer than 21 times, thus firmly connecting Iraq and Saddam to the terrorist network that had already attacked the United States on 9/11. Not incidentally, Powell’s speech also transformed Zarqawi into a major terrorist directing a worldwide terror organization. It is almost as if Colin Powell created Zarqawi, and ISIS, at that very moment.
            From this point, everything that the United States did played into Zarqawi’s hands. First came shock and awe, tearing apart a nation. Then came Paul Bremer, the moron placed in charge of the Coalition Provisional Authority in Iraq, who not only dismantled the entire governmental structure of Iraq, but then fired the entire military, leaving some quarter of a million experienced soldiers without a job or means of livelihood. Zarqawi wasted no time in recruiting thousands of these Sunni ex-soldiers, and today they form a major portion of the ISIS forces. Even General David Petraeus testifies in the documentary that the effect of Bremer’s move was “devastating” and planted the seeds of the insurgency. Zarqawi’s attacks began almost immediately, with devastating car bombs that turned Baghdad and the rest of Iraq into a charnel house of raging sectarian war. That he planned this was clear from a letter Zarqawi wrote laying out his intentions: he wanted Iraq torn apart by sectarian conflict that would leave it vulnerable to his more ambitious plan to create a caliphate. Bombing the UN headquarters added to the chaos, because both the UN and all the NGOs that might have provided some protection and order immediately fled Iraq.
            It was at this point that Nada Bakos sent a briefing document to the White House saying specifically that Zarqawi was responsible for the major attacks and was looking to foment a civil war. It got to Scooter Libby, who then called Bakos and summoned her to his office, clearly to pressure her to change her main conclusion, i.e., that there was an insurgency in Iraq that threatened the entire American project. It was that word, insurgency, that the White House found toxic. It implied that the Iraqi people weren’t completely overjoyed about the American invasion. Again, it was the optics that the White House wanted to change. So the White House, especially Donald Rumsfeld in press conferences, ridiculed news reports that only focused on the alleged chaos—which they vociferously denied. The denial, of course, made it impossible to combat the insurgency, which was allowed to grow unhindered.  
            Zarqawi made the most of such denial. He instituted a reign of terror that had never been seen before, beheading the American Nicholas Berg on camera to establish his credentials as a slaughterer of epic proportions (one of his monikers was the Sheikh of the Slaughterers). And though even Osama bin Laden tried to slow him down, objecting to the killing of Muslims by other Muslims, Zarqawi’s response was to blow up the most sacred Shia site in Iraq, the Golden Dome of Samarra. This was the final straw for Shias, and all-out sectarian war ensued—exactly what Zarqawi wanted. Shortly thereafter, he showed himself on camera firing an American automatic weapon to emphasize his power and ruthlessness, as well as his plan to set up an Islamic state as the first step in forming a global caliphate.
            We know the rest. Even though Abu Musab al Zarqawi was finally killed in a drone strike, his fiendish methods and plans have been continued by his even more ruthless successor, Abu Bakr al Baghdadi. What’s most disturbing is that all of this—the destruction of Iraq in the first place, the refusal to take out Zarqawi when the CIA wanted to, the idiocy of disbanding and setting adrift a quarter million potential fighters from the former Iraqi army, the mania to sanitize and justify the whole bit of lunacy in the first place—all of it might have been prevented if saner heads had prevailed. But of course, that is what marks the late unlamented Bush Administration: lunacy and hubris (and an optimistic savagery) from top to bottom. At this point—with so many lives lost or ruined, and the Middle East in unprecedented chaos—all we can do is hope we shall never see its like again.

Lawrence DiStasi