Friday, August 5, 2016

Our Malevolent Doppelganger


When I first heard of Siddhartha Mukerjee’s prize-winning book, The Emperor of All Maladies, I decided I wanted no part of it. Why read a book about cancer when we already hear far too much about this insidious horror of a disease? But then I read Mukerjee’s second book, The Gene, and realized that he is the latest in that enduring tradition of physicians (William Carlos Williams, Lewis Thomas, Abraham Verghese) who are also brilliantly accomplished writers. So I decided to take a look at The Emperor of All Maladies (Scribner: 2010). I’m truly glad I did. Mukerjee is the real thing—a physician who seems to have maintained his humanity, his soul, his sensitivity to words, even as he keeps a killing schedule as an oncologist. Both aspects of his persona give him the necessary insights to write the book he has: a ‘biography of cancer’ that crackles with the suspense of a good mystery. The mystery, of course, involves finding the biology and genesis of cancer as a disease—the scientific pursuit—and ferreting out the medications that might promise a cure—the therapeutic pursuit. Both, though with a heavy emphasis on finding the cure, make up what came to be known as The War on Cancer. It was a public-relations ploy designed to raise money to pay for research: money for trials of new procedures such as drugs for chemotherapy, surgery for tumor removal, X-rays to kill the offending cells; and also for the pure research into the fundamental biology of cancer cells to finally discover what, in fact, was causing cancer. What was the cellular malady that turned normally functioning cells into maniac proliferators of the tumors that were choking the body—always more and more bodies, it seemed—to death? And though the War on Cancer succeeded in raising an astonishing amount of money for cancer research and therapy, both from the U.S. Government and from private foundations, it never quite lived up to its promise. This is because the metaphor of war automatically implies taking aim at and destroying an outside enemy—a virus or a bacterium that invades the body. But what Mukerjee leads us to in the end is the discovery, made gradually over the years, that the enemy is not something external to the human body. The enemy is within. Within the body. Deep within the cell. As Pogo once famously said, “we have met the enemy and he is us.”
            This is really the main thrust of Mukerjee’s book for me: cancer is not something that invades the body from without; cancer is a relentless and sometimes beautiful (Mukerjee actually uses this word) perversion of the most basic process of the human body: cell division or mitosis. The body must reproduce its cells constantly in order to live, to survive (blood cells are produced in our bone marrow at the astonishing rate of 300 billion per day!). And cancer hijacks this process in a way that makes it a virtual duplicate of ourselves. Here is how Mukerjee puts it early on:

To confront cancer is to encounter a parallel species, one perhaps more adapted to survival than even we are….This image—of cancer as our desperate, malevolent, contemporary doppelganger—is so haunting because it is at least partly true. A cancer cell is an astonishing perversion of the normal cell…Like the normal cell, the cancer cell relies on growth in the most basic, elemental sense: the division of one cell to form two (38).

What is even more mind-boggling is that cancer is not simply a fierce replica of our own ability to produce cells; the resultant cells also have the ability to evolve, to change in response to our attempts to kill or halt them. Mukerjee again:

Every generation of cancer cells creates a small number of cells that is genetically different from its parents. When a chemotherapeutic drug or the immune system attacks cancer, mutant clones that can resist the attack grow out. The fittest cancer cell survives (39).

So though we might want to think of cancer as simply a “dumb” result of basic chemical processes, we are forced to realize that cancer, like all “dumb” life, possesses a deep and deeply ingrained intelligence. It recognizes attempts to extirpate it, bides its time, and works out strategies that allow it to survive and thrive (though here, as elsewhere in contemplating disease, I have never been able to quite figure out how “survival” fits a disease whose end game seems to be to destroy its host, and thereby, itself). Leukemia cells under attack from poisonous chemicals (combination chemotherapy), for example, seem to know enough to migrate (metastasize) to the brain, where these chemicals cannot follow because they are unable to cross the blood-brain barrier. Mukerjee calls the brain, in this instance, “a natural ‘sanctuary’ for cancer within the body,” for a leukemia that seems almost conscious: “sensing an opportunity in that sanctuary, [it] had furtively climbed in, colonizing the one place that is fundamentally unreachable by chemotherapy” (147).
            Mukerjee gives us a detailed history of how cancer came to be recognized as a specific disease (as far back as ancient Egypt), and the many therapies developed to combat it: surgery (his description of mastectomies to extirpate breast cancer leaves us fascinated and horrified at the ever more radical excisions that surgeons like William S. Halsted recommended in their mania to cut out every bit of remaining cancer—all this mutilation, in the end, to no avail); chemotherapy, which found drugs almost by chance, by trial and error, and evolved to include higher and higher doses of more and more drugs, often leaving the patient half-dead from nausea (the combination of drugs, X-rays, and spinal taps was called, at St. Jude’s, “total hell”); and radiation therapy, with higher and higher doses of X-rays, which themselves led to mutations; all in the effort to make the War on Cancer pay off with what was hopefully referred to as a “moon shot.” Mukerjee describes each of these phases in detail, often animated with case histories of some of his patients—the most memorable being Carla Reed. In her quest to stop her leukemia, we are told, Carla in 2004 entered “total hell,” visiting the clinic 66 times, with 58 blood tests, seven spinal taps, and several bone-marrow biopsies, in addition to multiple chemotherapies and radiations. Mukerjee cites a writer, a former nurse, describing a typical course of this “total therapy” at St. Jude’s hospital:

“From the time of his diagnosis, Eric’s illness had lasted 628 days. He had spent one quarter of these days either in a hospital bed or visiting the doctors. He had received more than 800 blood tests, numerous spinal and bone marrow taps, 30 X-rays, 120 biochemical tests, and more than 200 transfusions. No fewer than twenty doctors—hematologists, pulmonologists, neurologists, surgeons, specialists and so on—were involved in his treatment, not including the psychologist and a dozen nurses” (169).

But at least some of the patients suffering these agonies earned extensions of their lives. Mukerjee is harder on the results of the radical surgeries that were, and occasionally still are, the preferred treatment for breast cancer: 

Between 1891 and 1981, in the nearly 100 years of the radical mastectomy, an estimated 500,000 women underwent the procedure to “extirpate” cancer….Many were permanently disfigured; many perceived the surgery as a benediction…When radical surgery fell, an entire culture of surgery thus collapsed with it. The radical mastectomy is rarely, if ever, performed by surgeons today (201).

            The good news (if one can call it that) in Mukerjee’s story has to do with the long process of discovery about cancer biology, and the linking, finally, of these discoveries with therapies and drugs designed to match that knowledge. First, the discoveries (and it should be noted that all I can do here is provide a truncated sketch of what was and is a very complicated process). Beginning with a hunch by the German biologist Theodor Boveri, biologists began to home in on the mechanisms whereby the tightly regulated process of mitosis (cell division) in normal cells became chaotic in cancer cells. Bruce Ames, with his famous test on Salmonella bacteria in the late 1960s, found that a gene mutation would allow Salmonella to grow on sugar (galactose). He then saw that chemicals that scored high as mutagens (causing mutations) also tended to be carcinogens (causing cancer). Carcinogens, in short, had a common property: they could alter genes (mutation). One of the first carcinogens to be identified in the lab was a virus: the Rous sarcoma virus, which could insert a viral gene into cells and make them cancerous. Though many scientists then became convinced that all cancer was caused by viruses, Howard Temin soon saw that it wasn’t the virus but what it had done that was key. By examining the sarcoma virus, several scientists next found a specific gene, a single gene, that had done the damage. The gene was called src (pron. “sarc”), and it became one of a class called “oncogenes”—genes capable of causing cancer. It was then discovered how the src gene functioned: it encoded a protein whose main function was to modify other proteins by attaching a chemical, a phosphate group, to these proteins. Such protein enzymes were already known as kinases, and they acted as “molecular master switches,” switching one protein “on,” which then turned another “on,” until, with many proteins turned “on,” the target cell switched from a non-dividing to a dividing state, all under tight control. Src, by contrast, was a kinase on hyperdrive, turning normal cells into endlessly dividing machines, the hallmark of cancer.
            One final mystery remained: how did src evolve into an oncogene? Two scientists at UCSF (University of California, San Francisco), J. Michael Bishop and Harold Varmus, began studying src in the 1970s and came up with the solution. They discovered that src was not some foreign gene that had infiltrated normal cells; src was everywhere in normal cells from ducks to mice to fish to humans. But these normal src genes were not identical to the ones in the Rous virus. They were kinases, but not hyperactive ones; they were tightly regulated to act only during normal cell division. In short, they lacked the mutation that kept the viral gene permanently activated. Out of this, Varmus and Bishop developed a theory: normal src was a precursor to the cancer-causing viral src. It was a normal part of the cell, endogenous to the cell, that needed a mutation to turn it into a cancer-causer, an oncogene. Here is how Mukerjee sums up this vital discovery and insight:

The crucial implication of the Varmus and Bishop experiments was that a precursor of a cancer-causing gene—the “proto-oncogene,” as Bishop and Varmus called it—was a normal cellular gene. Mutations induced by chemicals or X-rays caused cancer not by “inserting” foreign genes into cells, but by activating such endogenous proto-oncogenes (362).

Mukerjee goes on to put this in historical perspective: “The Greeks had been prescient with their name for cancer, onkos (meaning load or burden). Cancer was intrinsically ‘loaded’ in our genome, awaiting activation” (362)—often by an environmental insult like cigarette smoke or radiation. He also cites a wonderful image from Harold Varmus’ speech when he and Bishop received the Nobel Prize in 1989, revealing the cancer cell to be “like Grendel (the monster in Beowulf), a distorted version of our normal selves” (363). 
            With many modifications and extensions (further research found that there were two “flavors” of cancer genes: positive ones like src that drive cell growth into hyperactivity [Bishop compared these to a “jammed accelerator”]; and negative genes, like Rb, that normally suppress cell division, but, with mutations, lose their suppressing function so that cell division goes on unhindered [as in “brakes” that don’t work]), this has become the dominant theory in cancer research. It has also, finally, led to targeted therapies—drugs designed to attack a specific kind of cancer-causing gene. Among these new targeted drugs was one called Herceptin, which targeted a breast-cancer oncogene labeled Her-2. In 1991, a patient named Bradfield was given Herceptin in combination with an older chemical, cisplatin, designed to kill breast cancer cells. Two months into her therapy, Bradfield’s neck tumor disappeared, and after 18 months of therapy, she was in full remission and survives today. Another is known as Gleevec, a drug developed by Ciba-Geigy (now Novartis) for chronic myeloid leukemia, or CML. Though Novartis at first refused to spend money on drug trials for Gleevec (not enough patients would use it for the company to make money), it finally relented and agreed to a few trials. As of 2009, CML patients treated with Gleevec were projected to survive an average of thirty years after diagnosis, strong evidence that targeted cancer therapy really does work.
            But lest we forget, Mukerjee reminds us that cancer is the wiliest of all diseases. Soon, doctors were noticing that some cancers were demonstrating Gleevec resistance (similar to bacteria that become resistant to antibiotics). It is worth trying to describe this Gleevec resistance to demonstrate the phenomenal intelligence Mukerjee is at pains to make us see. I have already described cancers that migrate into the brain to escape drugs; but there is another, a cancer cell mutation that, almost fiendishly, activates the cellular pumps that normally rid the cell of natural poisons—to get rid of the chemotherapy drugs! Gleevec-resistant cells did something more astonishing: they acquired mutations that precisely altered the structure of the leukemia-causing oncogene Bcr-abl, “creating a protein still able to drive the growth of leukemia but no longer capable of binding to the drug” (442). That is, where normally Gleevec slips precisely into a “narrow, wedgelike cleft in the center of Bcr-abl” to literally pierce its heart and kill it, the mutations altered this molecular “heart” so that the drug could no longer penetrate it (it no longer fit), thus making the mutated cancer immune. As Mukerjee puts it, “To escape targeted therapy, cancer had changed the target” (442).
            I don’t know about you, but this kind of (apparently) non-cognitive intelligence, even in a form that most of us would not hesitate to call “evil,” leaves me gasping for words. It does the same to Mukerjee, though he is quite adept at providing brilliant phrases and sentences in this captivating book. But let me give you some of the thoughts that it has evoked in me, and then end with Mukerjee again. It occurred to me today that if cancer is a perversion of our normal selves, our normal processes, as Mukerjee says, then we might say that it is a perversion because it is uniquely and brilliantly concerned only with its own survival. I have always had a problem with those who insist that the only thing that matters in life is survival. Because if survival is the end game, then cancer does it even better than we do. Cancer, as we see countless times in Mukerjee’s story, is the ultimate survivor. He even says, at one point, “Some day, if a cancer succeeds (in finding immortality), it will produce a far more perfect being than its host—imbued with both immortality and the drive to proliferate” (459). Cancer, that is, is better than we humans are at survival. The question becomes, is that all we are? Are we simply here to survive? Designed to outlast everything and everyone else? If that is the case, then we are on the “right” path, moving ourselves and the planet towards destruction as we do so. In this way, we are indeed just like cancer. Our monomaniacal drive for survival moves inevitably towards the destruction of our host, the only home we know. Which is why it is here that I part company with the survivalists. Though it is difficult to say how, and to what degree, humans, human being, is/are more than simply the number of years we survive or the sum of those who survive. Human being involves others, involves all other being. It matters to humans if others survive. It must matter, as we know from the cases where such mattering is thrust aside—and we see the horrors that have marked our century, and the horrors that may yet be coming if we do nothing but look out for our own survival, either as individuals, as a country, as a continent, as a hemisphere. We can easily predict the outcome of that kind of survivalism; we are already getting a taste of it today. No. Human being, again, is more than mere survival, and that is how we differ from cancer.
            So though Mukerjee ends with a reiteration of his overall theme (“Cancer is a flaw in our growth, but this flaw is deeply entrenched in ourselves. We can rid ourselves of cancer, then, only as much as we can rid ourselves of the processes in our physiology that depend on growth—aging, regeneration, healing, reproduction”), when he says this he is speaking strictly as a scientist, as someone looking at cancer, at humans, as strictly physical processes. Though I have nothing but admiration for his ability to do this, and for his ability to make us see how elegant and intertwined this dread disease is with our own fate, I part company with him here. For here, after all, is where we as humans have the capacity to look deeply into this flaw in our being, contemplate it, even come to terms with it, perhaps accept it (at least in the abstract)—and in so doing, comprehend it. We, that is, comprehend it; it does not comprehend us, other than as obstacles to its survival. And that makes all the difference.

Lawrence DiStasi


Thursday, July 28, 2016

Death of a Comrade


I’m not sure what I want to write about it, about the death yesterday of my old friend, Gian; except that I feel the need to write something. It brings to mind the opening piece I wrote for Una Storia Segreta (Heyday: 2001), which I titled after a line in Prospero Cecconi’s notebook, “Morto il camerata Protto.” Cecconi was referring to the death of his friend Giuseppe Protto, when both were domestic internees (yes, so-called ‘enemy aliens’ of Italian descent were interned during that war after being judged “potentially dangerous”) imprisoned at Camp Forrest, TN during World War II. Cecconi, too, felt the need to write something in his clandestine notebook, though all he could bring himself to write was that simple line, Morto il camerata Protto; ‘my comrade Protto is dead.’ It was enough; years later, we can feel the pain, the loss, the loneliness in that lone line.
            Now, my friend Gian is dead. È morto lui. And though I have more time and more skill with language to write volumes about it, there really is very little to add. He’s dead. My friend Gian, whom I’ve known since the mid 1970s, and with whom I’ve laughed and joked and written and studied and cooked and celebrated our common heritage—we organized a little group we called the circolo in the early 1990s; and would gather once a month to cook together (they were sumptuous feasts) and laugh together and reminisce about our Italian parents and childhoods and the foods we used to eat—my friend Gianni is gone.
            At first I took it rather philosophically. Yes, I knew he was ill and in hospital where I got to see him a week or two ago. Yes, I knew he had been placed on nothing but palliative care and was certain to slip into oblivion sooner rather than later. Yes, I had been expecting the call for days. And yes, I have more or less accepted the fact of death, the fact that we all die, that nothing is more certain than the death which is a necessity of our existence and often a blessing. But when it came, something internal shifted. I didn’t even notice it at first. I busied myself with finding some mementos I could contribute to an expected memorial service, some of his drawings, some writings about him and his vintage kitchen and vintage 1940s décor and vintage humor that kept me busy most of the afternoon yesterday. But in the night I began to realize that I was grieving, albeit not in the way we think of as grieving: no tears, no depression to speak of, no laments about the futility of life or the too-early death of this life, or how I would miss him. No. There was mainly this sense of drift. I suddenly felt unmoored. It was as if an anchor in my very life had come loose—but not a literal anchor; some inner anchor that was more like a void or an eraser that had left me, or part of me, vacant. Adrift. Easily blown away. This happens more, perhaps, when one is older and the friends and relatives that remain get fewer and farther between. I don’t know. All I know is that Gianni was my close friend, someone I could always count on to be in my imagined gallery of people to speak to or places like his unique house and kitchen to go to, to sit and drink a companionable glass of wine and complain or joke or laugh or cogitate about the follies of the world with. For instance, there was the time I had been on my zen walk and was heading home through Berkeley, and simply dropped in for a quick rest and a cup of coffee and he immediately saw something different in me, some spiritual light in me that no one else would have seen much less valued.
            And now that real space, that imagined space is gone. Empty. The weight of it, that’s what strikes me most. The weight it provided in my life, the ballast that kept things secure and at least partly known—which is what a friend is, what a relative is, a known weight or quantity to keep one firmly in place—was gone; is gone. He is no more. Though I can still conjure up his looks and his speech patterns and his laughter, the man himself is no more. The weight of him. The solidity of him. The actual belly and blood of him.
            And how peculiar it really is, this sense of another. We can imagine it sometimes. We can still find the outlines in our image drawer in the mind. But we know, after death, that it is only an image. It has no flesh to it. No bones to it. No scent or feel to it. No response to it because it can’t talk back. It can’t provide the answering weight of itself to our own presence because it has no presence anymore. And presence—what a person actually is in the flesh—though it’s almost impossible to express, is something we know. Know without any reflection or reason that that’s what a person is, really. That presence. And it is not captured in drawings or photos or films or any medium but itself. Part of it can be captured, the part that is analogous to what we can conjure on our mental screens. But the real fullness of it, the living presence of another human or animal or tree or flower—that is never, cannot be ever captured in any of our media. We are fooled that it is. We are fooled into thinking that we really know those we see on TV or on our computer screens or in our smartphones. We don’t. All we know is shadows, poor bereft shadows that have no weight, no depth, no life. Which is what we’re left with when someone dies. Shadows. I still have the shadow of Gianni. But his presence, his weight, his laugh, his life—that is gone forever. And in some terrible way, it makes me lighter, more fleeting, more adrift.
            That is what we grieve. We grieve, I grieve the loss of that unique, indispensable, never-to-be-repeated presence that was Gian Banchero. That will never come again. So simple: È morto lui. So commonplace: È morto lui. And yet so deeply, unfathomably vacant, empty, weightless, gone.

Lawrence DiStasi

Tuesday, July 26, 2016

The Gene and its Discontents


Notwithstanding its beautifully rendered history of how scientists, after 2,500 years of speculation, finally discovered and named the “gene” as the mechanism of heredity (Darwin had no idea of this mechanism, speculating about tiny things he called “gemmules”), for me the most fascinating parts of The Gene, by Siddhartha Mukerjee (Scribner: 2016), are the materials on eugenics. Originated by Francis Galton (Darwin’s cousin) in the late 19th century, “eugenics” refers to the idea that humans should try to select the “best” genes from among human populations and selectively advance those “good” genes and eliminate the “bad” ones to produce a race of “perfect” humans. In a lecture at the London School of Economics in 1904, Galton proposed that eugenics “had to be introduced to the national consciousness like a new religion.” Arguing that it was always better to be healthy than sick, and ‘good’ rather than ‘bad’ specimens of their kind, he proposed that mankind should be engaged in selectively breeding the best, the good, the strong. As Mukerjee quotes him: “If unsuitable marriages from the eugenic point of view were banned socially…very few would be made” (73), the mechanism to promote ‘suitable marriages’ being a kind of golden studbook from which the “best” men and women could be chosen to breed their optimal offspring. No less a figure than H.G. Wells agreed with Galton, as did many others in England who even then were expressing fear about the inferior working classes out-breeding the better classes. Galton founded the Eugenics Review in 1909 to further advance his ideas, but died in 1911, before he could really get eugenics going in England. But other countries like Germany and the United States were already taking steps to follow Galton’s lead. Indeed, at the first International Conference on Eugenics, held in London in 1912, one of the main presenters was an American named Bleecker Van Wagenen. Van Wagenen spoke enthusiastically about efforts already underway in the United States to eliminate “defective strains” (of humans) in America, one of which involved confinement centers—called “colonies”—for the genetically unfit. The inmates of these colonies were the targets of committees formed to consider the sterilization of ‘unfit’ humans such as epileptics, criminals, deaf-mutes, and those with various ‘defects’ of the eyes, bones, and mind (schizophrenics, manic depressives, the generally insane). As Van Wagenen suggested, 

Nearly ten percent of the total population…are of inferior blood, and they are totally unfitted to become the parents of useful citizens…In eight of the states of the Union, there are laws, authorizing or requiring sterilization (77).

            Van Wagenen was not kidding. The United States continued its misreading of Darwin and its enthusiasm for sterilizing the ‘unfit’ well into the 20th century, and it was not just the lunatic fringe that was involved. Mukerjee cites a famous case that came before the Supreme Court in 1927, Buck v. Bell. This case concerned one Carrie Buck, a Charlottesville, Virginia woman whose mother, Emma Buck, had been placed in the Virginia State Colony for Epileptics and the Feebleminded after she was accused of immorality, prostitution and having syphilis. In fact, Emma Buck was simply a poor white woman with three children who had been abandoned by her husband. No matter; Emma was judged ‘unfit’ and, with her mother confined, little Carrie was placed in a foster home, was removed from school by her foster parents to work, and at age 17 became pregnant. Her foster parents, John and Alice Dobbs, then had her committed to the same State Colony for the Feebleminded on the grounds of feeblemindedness and promiscuity, where Carrie gave birth in March 1924 to a daughter, Vivian. But having been declared mentally incompetent, Carrie was unable to stop the Dobbs from adopting her baby. (One reason the Dobbs may have wanted the baby was that it later turned out that Carrie’s pregnancy was the result of a rape by the Dobbs’s nephew.) Carrie was quickly scheduled to be sterilized, and the Supreme Court case of Buck v. Bell was brought to test the sterilization law—the 1924 Virginia Sterilization Act—to which Carrie Buck was subject, being already in a state institution for the feebleminded. Astonishingly, with the ‘great’ Oliver Wendell Holmes writing for the majority, the Supreme Court voted 8 to 1 that the Sterilization Act did not violate the U.S. Constitution’s due process provisions—since Carrie Buck had been given a hearing, and since she was already confined to a state institution. Mukerjee cites some of the now-infamous ruling by Holmes:

It is better for all the world, if instead of waiting to execute degenerate offspring for crime, or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind. The principle that sustains compulsory vaccination is broad enough to cover cutting the Fallopian tubes…Three generations of imbeciles is enough (83-4).

In accordance with the Supreme Court’s ruling, on October 19, 1927, Carrie Buck was sterilized by tubal ligation. The fact that her daughter Vivian—the ‘third generation imbecile’ Holmes referred to—had performed adequately in the school she attended, being of decidedly average intelligence, did not save Carrie; nor, for that matter, did it save her sister Doris, who was also sterilized, without her knowledge, when she had her appendix removed. After this, sterilization was free to spread in the United States; in 1927, for instance, the great state of Indiana revised an earlier sterilization law to cover “confirmed criminals, idiots, imbeciles and rapists,” with other states following suit. Pre-marital genetic fitness tests became widespread, as did Better Babies contests at State Fairs. With the help of practical ‘genetics,’ America was out to produce a race of perfect humans fitted to its already ‘perfect’ political system and ‘perfect’ society.
            The logical next step in the eugenics movement came, of course, in Nazi Germany. In 1933, Mukerjee tells us, the Nazis enacted the Law for the Prevention of Genetically Diseased Offspring, aka the Sterilization Law. Its premises were borrowed directly from America’s own program: “Anyone suffering from a hereditary disease can be sterilized by a surgical operation,” the diseases to include mental deficiency, schizophrenia, epilepsy, depression, blindness, deafness, and other serious deformities (121). Any cases in dispute were referred to a Eugenics Court, whose rulings allowed for no appeal. With films like Das Erbe (The Inheritance, 1935) propagandizing in its favor, the law became a grim model of efficiency, with 5,000 adults being sterilized each month by 1934. And as with their other, better-known programs, the Nazis moved smoothly and efficiently to the next step—euthanasia. A Scientific Registry of Serious Hereditary and Congenital Illnesses was set up, devoted to euthanizing (i.e., killing) ‘defectives’ in order to permanently ‘purify’ the gene pool. The Nazis coined a macabre euphemism to justify all this, by perverting Socrates’ famous dictum about “the unexamined life not being worth living” into its ghastly opposite: the euthanized were characterized as having lebensunwertes Leben, ‘lives unworthy of living.’ Though at first the targets were limited to children under three, soon the net was extended to adolescents, then juvenile delinquents, and finally in October 1939 to adults, with Jews at first conveniently labeled “genetically sick.” Typically, the Nazis set aside a villa, No. 4 Tiergartenstrasse in Berlin, as the official HQ of their euthanasia program, which eventually became known as Aktion T4, after its street address. Mukerjee at this point gives us one of his trademark elegant passages: 

But it is impossible to separate this apprenticeship in savagery from its fully mature incarnation; it was in this kindergarten of eugenic barbarism that the Nazis learned the alphabets of their trade….The dehumanization of the mentally ill and physically disabled (“they cannot think or act like us”) was a warm-up act to the dehumanization of Jews (“they do not think or act like us”) (125; my emphasis).

            There is other fascinating material in this altogether fascinating book, but I will leave most of that to other readers to discover. What I should like to stress is what Mukerjee himself stresses about genes, the genetic code, and eugenics. First, that genes, contrary to common perceptions, are not blueprints that form every element of an organism. Rather, they are like recipes: just as a recipe provides instructions for the process of cooking something, genes provide instructions for the process of building an organism. And as with a recipe, lots of chance or even intentional events can produce all sorts of variants. And, of course, the chance event par excellence is the mutation. The problem is that humans, especially those seduced by the prospect of either eliminating “bad” mutations or selecting for the “best” ones, misinterpret what mutations are and how they function in evolution. Citing the realization of Dr. Victor McKusick, Mukerjee makes the critical distinction that he wants everyone to grasp—that a mutation is a “statistical entity, not a pathological or moral one.” A mutation doesn’t imply something bad, like disease, or even a gain or loss of function:

In a formal sense, a mutation is defined only by its deviation from the norm (the opposite of “mutant” is not “normal” but “wild type”—i.e. the type or variant found more commonly in the wild). A mutation is thus a statistical, rather than normative, concept. A tall man parachuted into a nation of dwarfs is a mutant, as is a blond child born in a country of brunettes—and both are “mutants” in precisely the same sense that a boy with Marfan syndrome is a mutant among non-Marfan, i.e., “normal,” children (264).

This distinction is critical, especially as regards the benighted attempts to create perfect humans or a race of normal humans. What we call “normal” is merely that which seems to be fitted to a given time, place, and conditions. To try to select for this “normalcy” is to completely misunderstand what genetics and evolution tell us. The “fittest” are not those who have won some sort of evolutionary or genetic race that is good for all time. They are simply those who may have turned out to be well-adapted to a given set of environmental and social circumstances. The worst conclusion one could draw from such “fitness” would be a) to decide to select only for those adaptations and exclude all others; or b) to try to interfere in genomes and eliminate all genetic variants in the vain hope that humans could be bred free of all illness or ‘unfitness.’ Conditions inevitably change. We have no idea what conditions might arise that would require some of the variants that we would like to prune out of existence—and prune is the accurate word here, leading us, as it does, to our modern mania to favor certain varieties of, say, apples or corn or wheat, while completely discarding the thousands of varieties that have evolved over centuries. This is a kind of ‘vegetable eugenics’ that many botanists have warned could leave the world without staple crops in the event of a pathogen that wipes out the now-dominant varieties. In short, a diverse gene pool is an absolute necessity for evolution to proceed.
            Yet despite the disfavor that eugenics has encountered in our time, the kind of thinking that fosters it is far from dead. Mukerjee cites a case from 1969, where a woman named Hetty Park gave birth to a daughter with polycystic kidney disease, leading to the child’s rapid death. Park’s obstetrician thereupon assured her that the disease was not genetic, and that there was no reason she should not have another healthy child. Park conceived again, but sadly the same result ensued; whereupon Park sued her obstetrician, for bad advice, and won. The court ruled that “the right of a child to be born free of [genetic] anomalies is a fundamental right.” Mukerjee points out that “this was eugenics reincarnated.” In other words, the court had ratified an expectation that the particular genetic mutation that caused harm to the Park family violated their rights—in effect, that that mutation should not exist. In the coming world of gene manipulation, we can expect that many mutations that now are classed as “abnormal” will be similarly classified and excised from existence. But as Mukerjee reminds us again and again, if we can expect anything, we can expect that conditions will certainly change. What appears “normal” now may one day be considered to have had only temporary value, suited to a very specific time and place. As Mukerjee notes at the end of his book, “Normalcy is the antithesis of evolution” (481). That is, though we have come to distrust and despise “mutations” that compromise what we consider ‘normal,’ evolution absolutely requires them, requires a gene pool that is as varied and diverse as it can be. Mutations are the lifeblood of such diversity, the bank on which evolution relies to adapt to always new circumstances. And equally important, evolution does not proceed according to human wants or needs or the wants or needs of any organism. Evolution proceeds according to what works, what is adaptable to a given circumstance at a given point in time. There is no good or bad adaptation. There is no good or bad mutation. There is no “normal” much less “best” genome or genetic code. No one can ever know what might be needed. So before humans go about eliminating that which appears negative or useless in any given era, they should think twice about eliminating precisely that which might one day prove salvational. Here is how Mukerjee puts it towards the end of his book:

“Gene editing,” the stem cell biologist George Daley noted, “raises the most fundamental issues about how we are going to view our humanity in the future and whether we are going to take the dramatic step of modifying our own germ line and in a sense take control of our genetic destiny, which raises enormous perils for humanity” (479).

Siddhartha Mukerjee employs the history of eugenics as an object lesson: those ‘enormous perils’ of humans ‘modifying our own germ line’ are perils not just for humans, but for all life on this planet.

Lawrence DiStasi

Friday, July 22, 2016

TrumpSpeak


My title, as many of you will recognize, is a variant of the word “Newspeak” from George Orwell’s dystopian novel, 1984. Whether one should credit Donald Trump with coining a new form of speech may be questionable, but watching his performance last night, I was struck not so much by the laughable misrepresentation of almost all his alleged “facts” (if you want a good rundown of how each of Trump’s ‘factoids’ was grossly exaggerated, de-contextualized, or an outright lie, see the Washington Post piece here: https://www.washingtonpost.com/news/fact-checker/wp/2016/07/22/fact-checking-donald-trumps-acceptance-speech-at-the-2016-rnc/), as by his speech patterns. (In case you’ve forgotten, one of the reasons fact-checking doesn’t matter much for a Trump audience has to do with their ‘stone-age brains.’ Briefly, most people employ a quick, instinctive estimate, done in milliseconds, of a politician’s looks and/or manner, and completely bypass the reasoning process behind the information he delivers. This accords with the stone-age brains most of us still work with in interpersonal relations.)
            So, for now, let’s bypass the howlers Trump spouted in his overly long but factually empty speech, and attend instead to the patterns of rhetoric he used. To begin with, the man seems to be mostly driven—both in his domestic critiques and his foreign ones—by the notion of “getting a good deal.” This would figure, since his life seems to have been devoted to deal-making in the high-risk world of (mostly) Manhattan real estate. It is a world dominated by con men and hucksters who are always out to screw the naïve or the unwary. The New Yorker, therefore, must always be on his guard to make sure he’s not being screwed. This applies to all New Yorkers in all areas of life, but especially to those engaged in the dog-eat-dog world of real estate developing. Accordingly, Donald Trump’s rhetoric is full of critiques of his predecessors like Hillary and Obama and Bill Clinton for “not getting a good deal.” In his eyes, they gave away the store in the Iran nuclear deal; they gave away the store in Libya and Syria and Russia and China and especially in trade deals like NAFTA and the upcoming TPP. In short, previous political leaders succumbed to the cardinal sin in Trump’s world: they didn’t negotiate hard or cleverly enough; weren’t willing enough to play hardball; weren’t willing enough to talk tough and walk away and threaten and harangue. Now, of course, Trump has no way of knowing this; he wasn’t there; has never been engaged in any diplomatic activity or anything remotely political; and certainly is not about to consider the way that the United States has totally dominated and exploited almost every relationship it has entered in the post-World War II years. No. All he’s willing to bray about is how weak the nation has become, i.e. how it can no longer dictate the terms of every agreement due to its position as the biggest, baddest, most powerful nation on the globe. So he claims that he, the great real estate wheeler-dealer, will be able to make ‘better deals’—even, presumably, with those shirkers at home who want a free lunch.  
            And that brings us to the second noticeable rhetorical pattern. Trump never explains exactly how he’s going to accomplish all this. All he does is, first, exaggerate the problem—we’re besieged by criminals and loafers domestically and by terrorists from abroad, our cities are falling apart, our industry has all left for cheaper shores due to bad trade deals, cops are being murdered at the highest rate ever—and then assert that he’s the one who, with his superior deal-making ability, will fix the problem. Crime will end. Immigration will end. Terrorism will end. Globalization will end. Inner-city poverty will end. And he, Donald Trump, will end it.
            But how? These are complex, difficult problems that Republicans and Democrats alike have been promising to solve for decades. Not for Trump. The language is simple, the problems are simple, the solution is simple: Put Trump in Charge. And soon, trillions of dollars will be pouring into the nation’s coffers, taxes will be far lower, saving everyone more trillions, roads will be built, infrastructure will be modernized, onerous regulations will disappear, freeing up our energy sources (never mind the pollution or global warming) and pouring in even more trillions, and America Will Be Great Again.
            It is simple. And it is simpleminded. And the stone-age brains crowding the Republican Convention could not cheer loud enough or stomp hard enough or chant USA! USA! USA! often enough to roar their approval. Their devotion, even. Their lord and savior was saying it. He was saying it with confidence and certainty and with his jaw jutting out like some latter day Benito Mussolini, and they were ecstatic (as Mussolini’s crowds often were). He would talk tough. He would be tough. He would just take those over-educated fancy-nancy diplomats and bureaucrats by the throat, saying ‘fuck your reasoning and diplomacy and equity,’ and force them to give him a good deal. And if they didn’t, he’d bomb the shit out of them.
            And that’s it. After the longest speech in convention history, Donald Trump managed to say virtually nothing but the same posturing, simple-minded crap he’s been spouting throughout his primary campaign. Leaving the rest of us, the ones searching for some sort of program or plan or logic to his meandering speech, to wonder: how can they swallow this infantile pap? How can they not see that this guy has no capacity for any thought that’s longer than a sentence or two? Did you notice that? He never stayed with one subject for any sustained length of time: it was all quick cuts, as in a commercial. Crime in the streets. Shooting cops. Terrorists. Hillary and Libya, Iraq, Syria, Egypt. NAFTA. China. Back to high unemployment. Obamacare. It reminded me of what was revealed in Jane Mayer’s recent article in the New Yorker where she interviewed Trump’s ghostwriter Tony Schwartz (he wrote The Art of the Deal for Trump)—i.e. that Trump had no capacity whatever to focus on anything for longer than a minute or two. Trying to interview Trump, Schwartz found, was all but impossible: like trying to interview a chimp with ADHD (the metaphor is mine). The man had no capacity to concentrate at all, so Schwartz ended up following Trump around, listening in on phone calls and interactions and inspections, to scare up material for the book. The other thing Schwartz noticed—after Trump threatened him with a lawsuit and demanded that he return all the royalties Schwartz had earned from the bestseller—is that Trump’s famously thin skin demands that he instantly attack anyone who criticizes him. We all saw that in this spring’s Republican debates. What Schwartz reminds us is how frightening this quality would be in a President: 

“The fact that Trump would take time out of convention week to worry about a critic is evidence to me not only of how thin-skinned he is, but also of how misplaced his priorities are,” Schwartz wrote. He added, “It is axiomatic that when Trump feels attacked, he will strike back. That’s precisely what’s so frightening about his becoming president.” (Jane Mayer, “Donald Trump Threatens the Ghostwriter of ‘The Art of the Deal’”, New Yorker, July 20, 2016.)

            Donald Trump, in short, gave one of the most consistently alarmist acceptance speeches in American political history last night. But what we should truly be alarmed about is ever ceding the enormous responsibility and power of the American presidency to a man who is so ill-equipped—emotionally, mentally, and morally—to handle it. For if, with the help of a gang of speechwriters, he is unable or unwilling to put together a cogent argument that at least attempts to fill in some of the missing spaces of TrumpSpeak, then every American with an ounce of sense should be terrified about how those missing spaces might eventually take some reckless, cataclysmic shape.

Lawrence DiStasi

Friday, July 8, 2016

From Chilcot to ISIS


The bombshell in Britain in recent days has been the long-awaited (seven years in the making) report by Sir John Chilcot condemning Britain’s role in the 2003 invasion of Iraq. Most Britons, like most Americans, have long since concluded that the invasion was a disaster. But though the report fails to assign legal culpability (which many Britons who lost loved ones in the invasion hope to get), it does roast former prime minister Tony Blair pretty thoroughly. It says, in part, that his

 “judgements about the severity of the threat posed by Iraq’s weapons of mass destruction—WMD—were presented with a certainty that was not justified” and “Despite explicit warnings, the consequences of the invasion were underestimated….It is now clear that policy on Iraq was made on the basis of flawed intelligence and assessments. They were not challenged, and they should have been.”

It also explicitly condemns Blair (known in Britain as ‘Bush’s poodle’) for blindly following the lead of President Bush, citing a letter Blair wrote in July 2002 promising that “I will be with you whatever…” This, according to the report, constituted Blair’s only success: successfully appeasing George W. Bush.
            That the report took seven years to appear is in part attributed (by a 2003 report in London’s Independent cited in Alternet’s account of the Chilcot release) to a “fierce battle” waged by the U.S. State Department and the White House as early as 2003 to block release of the report because it allegedly contained “classified information.” Whether the release of the report in 2003 would have saved lives, either British or Iraqi, is not known, but it might at least have caused some re-evaluation of the Bush administration’s rationale for the invasion, which in turn might have led to Bush’s defeat in the 2004 election. Instead, of course, we got four more years of the worst presidency in history.
            If this were the end of it, the Iraq war blunder would still count as a horror costing millions of lives, but not as grave or extended a one as it subsequently turned out to be. For the current plague of ISIS attacks in Iraq, Syria, and now throughout the Middle East and the world stems directly from the hubris and secrecy of the Bush Administration during that time. This is made clear in a recent (first aired May 17, and again, when I saw it, on July 5) Frontline documentary: The Secret History of ISIS (http://www.pbs.org/wgbh/frontline/film/the-secret-history-of-isis/). What the documentary reveals is how ISIS was able to thrive and grow through a series of blunders—mainly driven by “optics”—regarding its first leader, one Abu Musab al Zarqawi. We learn that Zarqawi was known to the CIA even before the invasion in 2003: according to Nada Bakos, a CIA analyst charged with looking into his background, Zarqawi was a tough kid who grew up in a tough neighborhood in Jordan, one who appeared on his way to a lifetime in prison as a thug, pimp, and general hardass covered with tattoos. But one stint in prison radically changed him: he became a jihadist, a holy warrior; and to demonstrate his zeal, he actually removed his tattoos by using a razor blade to cut off his outer layer of skin. After that, he left Jordan for Kandahar in Afghanistan, determined to join up with Osama bin Laden. But bin Laden ignored this wannabe from Jordan, and in 2002 Zarqawi saw a chance to strike out on his own, this time in Iraq. He set himself up near the Iran/Iraq border and began building his den of crazies. Fortunately, the CIA had an informant in Zarqawi’s camp and saw him as a definite threat in the event of an invasion, particularly as Zarqawi’s group was apparently trying to build chemical and biological weapons. CIA analyst Sam Faddis, assigned to the case, therefore formed a plan to take him out, and forwarded the attack plan to the White House for approval.
            But the White House, in the persons of VP Dick Cheney and his aide Scooter Libby, wanted no part of the takeout, especially before the big invasion, so Cheney and Libby drove to the CIA to undermine the CIA’s information. From their aggressive questioning, it was clear that the White House had more in mind than simply worry about a strike that might pre-empt their war plans. They had cooked up a narrative concerning Saddam Hussein’s al Qaeda connection and involvement in 9/11 as a big part of their casus belli. And when the CIA said there was no connection, it was clear that Cheney/Libby badly wanted there to be one. This would eventually lead to Colin Powell’s memorable speech at the UN, in which the Secretary of State besmirched his reputation by accepting the White House’s script—which he, uncharacteristically, read verbatim. And though the White House appeared to follow protocol by sending the script to the CIA for vetting, Nada Bakos testifies in the documentary that the White House simply ignored the CIA’s corrections and stayed with its required script. As Colin Powell authoritatively put it: 

“…there’s a sinister nexus between Iraq and terrorist networks. Iraq today harbors a deadly terrorist network headed by Abu Musab al Zarqawi, an associate and collaborator of Osama bin Laden and his lieutenants.”

When confronted in the Frontline documentary about this clear fabrication in his UN speech, Colin Powell claims that his memory now is vague, but insists that his references to Zarqawi were unimportant to his general case. The truth is that a full seven minutes of the Powell speech were devoted to Zarqawi, who is mentioned no fewer than 21 times, thus firmly connecting Iraq and Saddam to the terrorist network that had already attacked the United States on 9/11. Not incidentally, Powell’s speech also transformed Zarqawi into a major terrorist directing a worldwide terror organization. It is almost as if Colin Powell created Zarqawi, and ISIS, at that very moment.
            From this point, everything that the United States did played into Zarqawi’s hands. First came shock and awe, tearing apart a nation. Then came Paul Bremer, the moron placed in charge of the Coalition Provisional Authority, who not only dismantled the entire governmental structure of Iraq, but then fired the entire military, leaving some quarter of a million experienced soldiers without a job or means of livelihood. Zarqawi wasted no time in recruiting thousands of these Sunni ex-soldiers, and they today form a major portion of the ISIS forces. Even General David Petraeus testifies in the documentary that the effect of Bremer’s move was “devastating” and planted the seeds of the insurgency. Zarqawi’s attacks began almost immediately, with devastating car bombs that turned Baghdad and the rest of Iraq into a charnel house of raging sectarian war. That he planned to do this was clear from a letter Zarqawi wrote laying out his plans. He wanted Iraq torn apart, he wrote, by a sectarian conflict that would leave it vulnerable to his more ambitious plan to create a caliphate. Zarqawi’s bombing of the UN headquarters added to the chaos because both the UN and the NGOs that might have provided some protection and order immediately fled Iraq.
            It was at this point that Nada Bakos sent a briefing document to the White House saying specifically that Zarqawi was responsible for the major attacks and was looking to foment a civil war. It got to Scooter Libby, who then called Bakos and summoned her to his office, clearly to pressure her to change her main conclusion, i.e., that there was an insurgency in Iraq that threatened the entire American project. It was that word, insurgency, that the White House found toxic. It implied that the Iraqi people weren’t completely overjoyed about the American invasion. Again, it was the optics that the White House wanted to change. So the administration, Donald Rumsfeld especially in his press conferences, ridiculed news reports that focused on the alleged chaos, which it vociferously denied. The denial, of course, made it impossible to combat the insurgency, which was allowed to grow unhindered.
            Zarqawi made the most of such denial. He instituted a reign of terror that had never been seen before, beheading American Nicholas Berg on camera to establish his credentials as a slaughterer of epic proportions (one of his monikers was the Sheikh of the Slaughterers). And though even Osama bin Laden tried to slow him down, objecting to the killing of Muslims by other Muslims, Zarqawi’s response was to blow up one of the most sacred Shia sites in Iraq, the Golden Dome of Samarra. This was the final straw for Shias, and all-out sectarian war ensued—exactly what Zarqawi wanted. Shortly thereafter, he showed himself on camera firing an American automatic weapon to emphasize his power and ruthlessness, as well as his plan to set up an Islamic state as the first step in forming a global caliphate.
            We know the rest. Even though Abu Musab al Zarqawi was finally killed in an American airstrike in 2006, his fiendish methods and plans have been continued by his even more ruthless successor, Abu Bakr al Baghdadi. What’s most disturbing is that all of this—the destruction of Iraq in the first place, the refusal to take out Zarqawi when the CIA wanted to, the idiocy of disbanding and setting adrift a quarter million potential fighters from the former Iraqi army, the mania to sanitize and justify the whole bit of lunacy—all of it might have been prevented if saner heads had prevailed. But of course, that is what marks the late lamented Bush Administration: lunacy and hubris (and an optimistic savagery) from top to bottom. At this point—with so many lives lost or ruined, and the Middle East in unprecedented chaos—all we can do is hope we shall never see its like again.

Lawrence DiStasi

Saturday, July 2, 2016

'Killer App' Addendum

Scanning my college alumni magazine, I came across a piece by Judith Hertog called “A Monitored State.” Since it relates closely to my earlier blog, Killer App, I thought its report might be useful here as a gloss on that piece. “A Monitored State” describes Dartmouth professor Andrew Campbell’s experiment monitoring student behavior via the smartphones that virtually all students carry and use constantly. A paper he wrote described how smartphone sensor data “contain such detailed information about a user’s behavior that researchers can predict the user’s GPA (grade point average) or identify a user who suffers from depression or anxiety.” In this study, called Student Life, 48 student volunteers allowed Campbell’s team to gather a stream of data via an app installed on their smartphones. The app “tracked and downloaded information from each phone’s microphone, camera, light sensor, GPS, accelerometer and other sensors” and then uploaded it to a database. By analyzing the data, Campbell’s researchers were able to record details about each student’s location, study habits, parties attended, exercise programs, and sleep patterns. For at least two students, Campbell was even able to see signs of depression: “I could see they were not interacting with other people, and one was not leaving his room at all,” Campbell said. Both failed to show up for finals, whereupon Campbell gave them incompletes and encouraged them to return in the fall to complete his course, and their others, successfully. What Campbell draws from this is that, in the future, not only will universities be able to intervene to help students in such situations, but such information will be available in real time to monitor everything, including the state of every student’s mental well-being.
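            To make the kind of inference Hertog describes a bit more concrete, here is a minimal, purely illustrative sketch of how daily sensor summaries might be screened for the withdrawal pattern Campbell mentions (little movement, little conversation, most of the day spent in one place). It is not Campbell’s actual pipeline: the field names, thresholds, and fourteen-day window are all invented assumptions.

```python
# Hypothetical sketch, not the Student Life code: screen per-student daily
# sensor summaries for a "withdrawal" pattern (little movement, little
# conversation, most of the day spent in one place). All field names and
# thresholds are invented for illustration only.

from dataclasses import dataclass
from statistics import mean
from typing import List

@dataclass
class DayRecord:
    gps_km: float            # total distance traveled that day (from GPS)
    conversation_min: float  # minutes of nearby conversation (from microphone)
    stationary_frac: float   # fraction of the day spent in one location

def withdrawal_flag(days: List[DayRecord],
                    window: int = 14,
                    min_km: float = 1.0,
                    min_talk: float = 20.0,
                    max_stationary: float = 0.85) -> bool:
    """Return True if the last `window` days look like social withdrawal."""
    recent = days[-window:]
    if len(recent) < window:
        return False  # not enough data to judge
    return (mean(d.gps_km for d in recent) < min_km
            and mean(d.conversation_min for d in recent) < min_talk
            and mean(d.stationary_frac for d in recent) > max_stationary)

if __name__ == "__main__":
    # Two weeks of synthetic data for a student who rarely leaves his room.
    student = [DayRecord(gps_km=0.3, conversation_min=5.0, stationary_frac=0.95)
               for _ in range(14)]
    print(withdrawal_flag(student))  # True: worth a check-in, not a diagnosis
```

Even this toy version makes the privacy point: a handful of passively collected numbers per day is enough to guess at something as intimate as a student’s state of mind.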
            Campbell has also collaborated with brain-science colleagues “to discover how smartphone sensor data can be combined with information from fMRI scans” in order eventually to create apps that not only identify mental problems but also “intervene before a breakdown occurs.” In fact, in a follow-up phase of his study, he got student volunteers to submit to fMRI scans and to wear a Microsoft smart band that collected bodily signals like heart rate, body temperature, sleep patterns, and galvanic skin response—all associated with stress. Thus today’s technologies can (and already do) detect not just simple behaviors but, at least grossly, an individual’s state of mind. One of Campbell’s colleagues predicts that, in addition to being able to predict which individuals are “most susceptible to weight gain,” the smartphone of the future will be able to warn when “its owner enters a fast-food restaurant.”
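            To make concrete what this kind of monitoring involves, here is a purely hypothetical sketch in Python of how per-day sensor summaries (conversation time, places visited, sleep, steps) might be screened for the withdrawal pattern Campbell describes. None of the field names, thresholds, or logic comes from the StudentLife study; it is only an illustration of the general idea.

    # Hypothetical example only: not Campbell's StudentLife code or data schema.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class DailySummary:
        date: str
        conversation_minutes: float   # estimated from the microphone's audio classifier
        places_visited: int           # clustered from GPS traces
        sleep_hours: float            # inferred from light, screen, and motion sensors
        steps: int                    # from the accelerometer

    def flag_possible_withdrawal(days: List[DailySummary],
                                 min_talk: float = 15.0,
                                 min_places: int = 2) -> bool:
        """Flag a student who, on most days, barely talks to anyone and
        barely leaves one location (invented thresholds, not clinical ones)."""
        if not days:
            return False
        quiet = sum(1 for d in days
                    if d.conversation_minutes < min_talk and d.places_visited < min_places)
        return quiet / len(days) > 0.7

    # A week of near-total isolation trips the flag.
    week = [DailySummary(f"2016-05-{i:02d}", 5.0, 1, 9.0, 800) for i in range(1, 8)]
    print(flag_possible_withdrawal(week))   # True

            Even a toy rule like this shows how little data it takes to label a student as withdrawn or at risk, which is precisely what worries the ethicists discussed next.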
            The potential threat from all these technologies has not been lost on Campbell and his colleagues. His collaborator, Prof. Todd Heatherton, already worries about a future shaped by the constant collection of data monitored by smartphones, and by its use by companies (insurance underwriters, for instance) to determine who gets insurance and how much they pay for it. Heatherton was also shocked by how casual students were about sharing such personal data for the study. But clearly, this generation is already used to sharing just about everything on apps like Find My Friends (an app that broadcasts one’s location to everyone in one’s network). For Heatherton and others, this raises important questions about the ethics of all this technology and how far it can be used to monitor every detail of our lives. James Moor, a Dartmouth philosophy professor specializing in ethics, worries about how information about a person’s entire life could be used by governments wanting, for just one example, to monitor those on welfare. Or by totalitarian governments that could use such data to keep potentially rebellious populations under rigid control.
            Campbell himself worries about the same thing, hoping that legislation will be forthcoming that will at least give individuals ownership of their own data (now being used by Google and many others for commercial purposes and more). People need to think about this, he says, and realize that “we are turning into a monitored state.” Or perhaps already are.
            Even George Orwell couldn’t have imagined such an easily ‘big-brothered’ state—and all thanks to those adorable smartphones.  

Lawrence DiStasi

Wednesday, June 29, 2016

Brexit's Tectonics

Like just about everyone else on the planet, I have been trying to sort out my reactions to Brexit—the British vote in favor of exiting the European Union. And what occurred to me even as I first heard the news was this: maybe it’s a necessary warning sign to the neoliberal powers-that-be that globalization, not democracy, is what’s run amok. Far from being just a protest vote against the influx of “foreigners” and migrants—in other words, the racist reaction of the great unwashed of the British lower classes—it may well be far deeper. It may be, that is, a cry of the heart from those who do not want to be homogenized in the great likeness machine of global corporatocracy that seeks to make everyone a stamped-out cog in the consumer-exploiting Walmarts of the world. Nationhood may be anachronistic or even dangerous in this ever-more-connected world, but it’s also one of the few things that has a chance of keeping different sections of the globe unique. And what we need now is more of it, not less—more distinctiveness in separate populations, more distinctiveness in dress, language, buildings, customs, ways of doing and being. Skyscrapers in Dubai, no matter how marvelous the technical skill they demonstrate, simply strike one as completely out of touch with their surroundings. We need people and populations that are more in touch with their surroundings, more attuned to the particular flora and fauna in which they arise. And this may be what Brexit, and the Trump phenomenon in our own country, are more deeply about.
            We hear about the chaos in the financial markets: the British pound dropping like a stone, the stock market here and elsewhere dropping similarly, the financial and economic mavens predicting more and more dire outcomes from uncertainty. And though no one wants to see another financial crash, perhaps this is precisely what’s needed to wake these guys up. Just consider what the financial wheeling and dealing of the past few decades has led to: conditions of inequality in both Britain and the United States that are almost unprecedented. The corporate CEOs, the banksters, the hedge fund managers are making obscene amounts of money and living like oriental potentates, while the working slobs have been going steadily backwards. More and more people lose their jobs to foreign countries whose workers slave away at wages that make competition impossible. More and more corporations rush to have their goods made in these foreign factories, shifting them whenever another country offers yet lower wages. And the gulf between the very wealthy few running things and the masses of impoverished working stiffs racing to the bottom grows ever wider. There is a professional class that manages to stay reasonably solvent—the bank managers, the professoriat, the politicos. But they hew to the party line of whoever’s in power, Conservatives or Labourites, Democrats or Republicans, and maintain their insider edge regardless of who’s got the reins (see Thomas Frank’s recent exposé of the Democratic Party, Listen, Liberal). Meantime, those trying to catch up find themselves ever deeper in debt, even the middle classes who must incur a lifetime of debt to afford a college education. So there’s a logic to chaos in financial markets. These usurers should find themselves in chaos. They should find themselves at the bottom of a pit. But of course, they usually don’t. And those who land in the pit are the suckers who buy into the myths of progress and globalization and trade deals making everyone richer, and all the other myths about the benefits of trade we’re constantly sold.
            What occurs to me, then, is that this isn’t about politics so much as economics. And there is a difference. I, for one, am in favor of the United Nations and of attempts to keep the violence of the world under reasonable control. This requires that nation-states give up some of their sovereignty, which always elicits protests and anguish from the breast-beaters on the Right. But by and large, with some notable exceptions such as Israel’s continuing occupation and ethnic cleansing in Palestine, the system has worked fairly well. Invasions by one aggressive state of another’s territory have been pretty much limited—though not entirely eliminated. The condemnation attaching to naked aggression such as we saw in the 1930s and before has made such ventures too costly to most nations’ global reputations. This, again, is not to say that such aggression has been totally foreclosed, but it has, for the most part, been priced too high for most nations to undertake lightly. The loss of sovereignty is worth the gain in peace (or at least accommodation).
            In the economic sphere, however, the situation is almost diametrically opposed. Here it is not nation-states that are the major offenders, but trans-national corporations and the aptly named ‘vulture capitalists,’ many of which have simply transcended national boundaries. This shift is mainly the result of trade agreements like NAFTA and the still-unratified TPP. Indeed, the terms of trade agreements in a globalized world have meant that national sovereignty has become subservient to corporate rights—the right to make a profit. One example says this loud and clear: the recent lawsuit filed by TransCanada. The corporation that had planned the Keystone XL oil pipeline from Canada through the United States has just filed suit demanding $15 billion in compensation for the losses it claims from the “expected profits” it stood to make had the pipeline been approved. This accords with the boilerplate provisions in such trade deals: corporations have essentially been granted the right to “expect” profits from planned ventures in any nation they choose, and if such plans come into conflict with a nation’s determination to prevent the despoliation of its territory or people, then too bad. The corporation has prior rights here—a right to sue for damages to its profit—while a nation has no right to prevent damage to its land or water or environment, or to the globe itself in the case of global warming. This, to me, is about as outrageous as capitalism gets. The underlying notion is that profit is sacrosanct, taking precedence over considerations of human health or the health of the nation and the planet itself.
            This, I think, is what is really at issue in the Brexit vote and the Trump/Sanders phenomenon in the United States. The people who are being crushed by the depredations of big corporations—which pursue profit anywhere and everywhere, no matter the damage to the nations in which they reside—have begun, if only dimly, to catch on. Perhaps they don’t see beyond the slogans and xenophobia. Perhaps they can’t or won’t articulate what is really at the heart of their malaise. But on some level they understand. This is why they want to “take back their country.” What they want is some control, or someone they elect who can control the monsters called corporations and financial institutions that seem able to roll over anything in their way with no consequences. What they want is some control over the obscene upward redistribution of wealth that has taken place in the past half-century. What they want is something like fairness in the way their lives are ordered, some more direct connection to what they know and can see, rather than some distant insider decision-making that is invisible to them. What they want is some indication that their vote can have some effect on the levers of power despite the fact that they are not mega-rich. Brexit, for once, gave many in England that indication. And my guess—and my hope—is that more Brexits are on the way—especially if the powers that be do not wake up to the earthquake that has just struck them.

Lawrence DiStasi