Tuesday, May 1, 2018

That Old Devil Sugar

It is surely no secret that the United States, and, increasingly, the rest of the world, is wallowing in an epidemic of diabetes (the CDC says that one of every four American teenagers has type 2 diabetes, with the expectation that by 2040 it will be one in three. Teenagers!), and that many doctors and other healthcare providers consider sugar to be at least one culprit. It is therefore somewhat surprising that Gary Taubes, in his latest book, The Case Against Sugar (Knopf, 2016), takes the position that there is a serious debate, if not a war, raging about this, and that he must therefore prosecute sugar as responsible for just about every major ill we have, including cancer and Alzheimer’s. For my purposes, though, I’d just like to see agreement that sugar is clearly responsible for a few: diabetes, metabolic syndrome, heart disease, and hypertension (high blood pressure). That’s because I have been diagnosed late in life with at least two of those: type 2 diabetes and high blood pressure. I have both more or less under control, but anything that can bring clarity to what causes either or both is welcome. 
            For Taubes, the case is crystal clear. The culprit is sugar, especially the refined sugar called sucrose (half glucose, half fructose) and the ingredient that seems to be in just about every prepared food one buys these days, high-fructose corn syrup (55% fructose, 45% glucose). These are the empty calories that Americans and inhabitants of all advanced industrial countries have been consuming in ever greater amounts for a few hundred years (Americans in 1999 were consuming an average of 158 pounds of sugar per year). And according to several studies that Taubes cites, it is this major dietary change that accounts for the damage wrought to our bodies. Humans have simply not had time to evolve fast enough for our bodies and organs (particularly the liver and pancreas) to handle the huge increase in refined sugars we now consume. This evolutionary argument is one of the best Taubes makes. For example, he notes that yearly per capita consumption of sugar “more than quadrupled in England in the eighteenth century, from four pounds to eighteen pounds, and then more than quadrupled again in the nineteenth,” while in the United States “sugar consumption increased sixteen-fold over that same century” (42). But the real argument comes in studies done more recently on indigenous populations like American Indians and Africans who have changed to Western-style living more recently and more rapidly. The Pima of Arizona, for example, along with other Native Americans, have seen diabetes rates explode from almost nothing when eating their native diet to over 50% now, after changing to a Western diet laced with sugar and immense quantities of soda. Another major study, Western Diseases (1981) by Denis Burkitt and Hugh Trowell on indigenous populations in Africa, saw tooth decay, gout, obesity, diabetes, and hypertension skyrocket with Westernization (229). A further study, of natives of Tokelau, a Pacific island territory administered by New Zealand, saw the same phenomenon: from a diet of coconut, fish, pork, chicken, and breadfruit (a high-fat diet, Taubes notes), the Tokelau people shifted to Western ways in the 1970s. Some migrated to mainland New Zealand, while some simply stayed on Tokelau, but both groups adopted Western ways and foods. Diabetes shot up to engulf almost 20% of the women and 11% of the men, with corresponding increases in hypertension, heart disease, and obesity. To sum up, Taubes quotes one of his heroes, John Yudkin, a University of London nutritionist, who wrote in 1963: “We now eat in two weeks the amount of sugar our ancestors of two-hundred years ago ate in a whole year” (154). 
            But many of us have heard these stories and statistics before. What is surprising, even shocking, in Taubes’s argument is the supporting historical detail. As one might guess from his argumentative title, not everyone agrees with Taubes. In fact, for the major arbiters of American eating policy like the FDA and the NIH (National Institutes of Health), sugar has remained a kind of untouchable, on the list of GRAS (generally recognized as safe) foods until virtually the present day (a quick Google search revealed that the latest FDA recommendations concerning sugar only warn that sugar can cause tooth decay!). This contrasts with FDA decisions on the sugar substitutes saccharin and cyclamates, which, with urging from scientists sponsored by the sugar industry, have been listed as possibly carcinogenic. But the major thrust of scientists like Ancel Keys of the University of Minnesota and Fred Stare of Harvard (both funded handsomely by the sugar industry, Taubes points out) has been twofold: first, that it is the fats in our diets that kill us prematurely in the West, via heart disease and diabetes; and second, that “we get obese or overweight because we take in more calories than we expend or excrete” (107-9). Taubes calls this “the gift that keeps on giving,” because it essentially exonerates sugar from any role in obesity or diabetes or the host of other diseases plaguing Western societies. It doesn’t make any difference what you eat, goes this mantra, because “a calorie is a calorie.” Eat too much food (too many calories), and exercise too little, and you get fat, which causes you to get diabetes and die early. Period. End of discussion. Forget about the impact of refined foods (white flour, white sugar) on one’s metabolism. 
            Taubes goes into most of the studies and white papers that chart this, to him, massive fraud, but it’s not necessary to repeat that here. Suffice it to say that the sugar industry has used its massive profits and political clout to pretty much cloud the issue of sugar’s deadly effects in much the same way the tobacco industry clouded, for years, the dangers of tobacco smoke. Indeed, one of the really surprising roles of the sugar devil is in flavoring tobacco. Yes, that’s right. Tobacco growers and cigarette makers learned around the turn of the twentieth century that tobacco in cigars and pipes wasn’t really selling enough product (getting enough people hooked). So, in 1914, R.J. Reynolds introduced Camels, “the first brand of cigarettes made of multiple tobacco types (basically flue-cured Virginia, and Burley tobacco) blended together” (64). Sugar entered in two ways. First, flue-curing the Virginia tobacco raised its natural sugar content from about 3% to roughly 22%. That higher sugar content makes the tobacco more inhalable, because the smoke becomes acidic, not alkaline (alkaline smoke irritates the mucous membranes and stimulates coughing, which is why pipe smokers rarely inhale). Second, the nicotine-rich Burley tobacco was “sauced” with sugars from honey, molasses, licorice and so on. This was first done for chewing tobacco, but Reynolds put it in Camels, and this really did the trick. Why? Because by blending “sauced” Burley tobacco with already-sweetened flue-cured Virginia, Camels were able to deliver a sweet-tasting and -smelling cigarette that was easier to inhale, and thus “maximized the delivery of nicotine—and carcinogens—to the human lungs” (69). Most other tobacco companies and brands wasted no time following Camels, to the point that “by 1929, U.S. tobacco growers were saucing Burley tobacco with 50 million pounds of sugar a year” (69). That this is not fiction is evidenced by a 2006 report from the Netherlands that Taubes quotes: “Consumer acceptance of cigarette mainstream smoke [what’s directly inhaled] is proportional to the sugar level of the tobacco” (70). In other words, sugar’s key role in making cigarettes palatable (inhalable) to both men and women worldwide contributed in a major way to the epidemic of lung cancers that are still with us. 
            Taubes has endless data on the role of sugar in diseases, but a couple of points stand out. First, he cites studies showing that sugar may well be addictive (as anyone with children knows). It elicits a response in the brain’s “reward center,” i.e. the nucleus accumbens, that closely resembles the response from nicotine, cocaine, heroin, and alcohol. This may help explain why its rise has been so spectacular in every society where it was introduced. More important, to me at least, is the connection Taubes tries to make between sugar and what is now generally conceded to be a key factor in diabetes and a host of other diseases that cluster with it—insulin resistance. Insulin, of course, is the hormone produced in the pancreas that acts to combat high blood sugar. Basically, when blood sugar (glucose) levels rise, the pancreas responds by secreting insulin, which “signals the muscle cells to take up and burn more glucose.” Insulin also induces cells to store some of the glucose as fat (insulin is said to be “lipogenic” or fat-forming). Then, when blood glucose falls, the insulin level falls too, and the stored fat can be burned instead of glucose. The problem is that some people (and increasing numbers in Western societies) exhibit a condition known as “insulin resistance”: their cells do not respond to the insulin, and hence both blood glucose levels and insulin levels remain high or climb higher. 
The question is: what causes insulin resistance in the first place? 
This would appear to be the 64-million-dollar question (the fight over this is quite active even today, with one side blaming the “drop in insulin sensitivity” on a fat (“intramyocellular lipid”) said to block cell receptors from accepting insulin, and another blaming insulin resistance on excess carbohydrate [especially sugar] consumption and too much insulin production). Oddly, Taubes himself does not give a conclusive answer. He strongly suggests that insulin resistance is caused by too much sugar consumption, but he hedges. He says that, in contrast to the sugar industry’s old mantra that insulin resistance and diabetes are caused by obesity, i.e. eating too much or consuming too many calories, another possibility 

is that these elevated levels of insulin and the insulin resistance itself were caused by the carbohydrate content of our diets, and perhaps sugar in particular. Insulin is secreted in response to rising blood sugar, and rising blood sugar is a response to a carbohydrate-rich meal. That somehow this system could be dysregulated such that too much insulin was being secreted and that this was causing excessive lipogenesis—fat formation—was a simple hypothesis to explain a simple observation. (120-1). 

But because insulin resistance is so important as a causal factor in diabetes, obesity, metabolic syndrome, heart disease, and so on, we would like something firmer. Taubes doesn’t give it to us, partly, he explains, because the NIH decided long ago not to fund long-term studies on the effects of sugar in the American diet because it would cost too much. It would also take a long time, since diabetes does not develop overnight; like tobacco’s, sugar’s effects usually take years to manifest. But still, we want more. And Taubes does cite one study from Switzerland that seems to provide some scientific proof. Luc Tappy of the University of Lausanne studied fructose (which is metabolized without the need for insulin) in the mid-1980s. What his study found is crucial, even though it was short-term:
When Tappy fed his human subjects the equivalent of the fructose in 8 to 10 cans of Coke or Pepsi a day—a “pretty high dose” as he says—their livers would start to become insulin-resistant and their triglycerides would elevate in just a few days. With lower doses, the same effects would appear but only if the experiment ran for a month or more (205). 

This would seem to be a sound study, indicating insulin resistance and elevated triglycerides, both key markers for diabetes and for heart disease, from excess sugar. But again, its effects derived only from the use of fructose—though that is the main ingredient in most sugared drinks and lots more, via that demon invention, high-fructose corn syrup. 
            Unfortunately, this is as much as Taubes provides concerning insulin resistance. And that is a shame. 
            Still, Taubes’s book is well worth reading, if only for the disgraceful history it unfolds, a history of sugar that has scarred this nation from its very beginnings (slavery and sugar are intimately intertwined) and which still, in 2018, thanks to the power of the sugar industry in shielding its product from blame, continues its devilish work. 

Lawrence DiStasi

Tuesday, April 24, 2018

Opioid Crisis

I probably wouldn’t have read Dreamland if my daughter and son-in-law hadn’t given it to me when we had Easter dinner. After all, there has been nonstop coverage of the overdose deaths plaguing white America for several years, and I pretty much thought I knew it well enough—well enough to be disgusted that the attitude of official America had softened dramatically from the “get tough on drugs” approach that had locked up millions of urban African Americans to the “we need understanding and medical treatment” that had prevailed when the addict ghetto suddenly became white and suburban. But they did give it to me, and I did read it, and now I think everyone else should too—read the prizewinning 2015 book by Sam Quinones titled Dreamland, that is. It really is a gripping and infuriating tale of malfeasance, first by a medical community that adopted prescription opiates for pain with missionary-like blinders; then by the health care establishment that refused to pay for the full pain treatment (psychologists, social workers, etc.) that pain clinics recommended, paying only for pills; then by the pill-pushing pharmaceutical companies that cashed in on the bonanza without a care for the horror they were perpetrating but only for their profit; and finally by an American public whom success had schooled to expect quick technological fixes, especially for pain, any pain at all, which, they thought, every American had a right never to suffer. All this, and more, is part of the true tale that Sam Quinones tells. It is essentially a tale with two major strands: in one, unscrupulous doctors operating “pill mills” write thousands of opioid prescriptions for people with nothing but a claim of pain (back pain, knee pain, any kind of pain), thus creating millions of addicts in suburban towns where none had existed before; and in the other, young men from a small area in western Mexico called Nayarit discover that the black tar heroin they can make from poppies nearby can be peddled in an entirely new, safe, and convenient way in small cities and suburbs in America.  
            Quinones begins his tale with the story from which his title comes, Dreamland. It was the name of a community pool in Portsmouth, an idyllic center of safety and pleasure for a suburban town in southern Ohio that used to be a main U.S. manufacturing center for shoes and shoelaces. As in other small Midwest factory towns, though, outsourcing to foreign countries with cheap labor killed the manufacturing, and killed Dreamland along with it. And it was in Portsmouth, Ohio, along with nearby towns in Kentucky and West Virginia, that the opiate epidemic found a dispirited and dead-end population eager to be seduced out of its existential pain. 
            To understand how this could happen, Quinones takes us through the growth of the American pain revolution. He starts with the discovery of heroin in Germany in 1898 (the drug was actually first synthesized in London in 1874 as diacetylmorphine, but the Bayer chemist named Dreser who duplicated it in Germany called it heroin—from heroisch, German for “heroic.”) Though heroin was first believed to be non-addictive, regulators soon learned better, and its use in the United States was strictly controlled starting in 1914. Still, there continued to be people, particularly in big cities, who got hooked, and one of the treatments for such addicts was a substitute drug called methadone—with methadone clinics maintaining addicts in cities all across the country. The problem with these clinics was that they tried to limit addicts to doses of methadone that were too small, so addicts had to find their dope fix elsewhere. Now shift to the Nayarit-based (the municipality was Xalisco) dealers of black tar heroin, a form that is much less purified than regular street heroin, but also far more powerful—delivering 80 percent heroin compared to the usual street version at 10 or 15 percent. As Quinones puts it, the Xalisco Boys “discovered that methadone clinics were, in effect, game preserves” which provided one of their first footholds in American cities. 
But then the pain revolution happened. It started in the ’80s with well-meaning doctors seeking to provide mostly terminal cancer patients with pain relief. They called it ‘palliative care,’ essentially figuring that risking addiction ranked low on the ladder for someone who was dying soon anyway. This work was ‘helped’ when, in 1984, Purdue Frederick Pharmaceuticals came out with MS Contin, a timed-release morphine pill that would later evolve into OxyContin. Fitting perfectly with pain management, the new pills “seemed far less addictive because through their timed-release formulas they eased relief out to the patient over many hours” (85); that is, addicts couldn’t get the big “rush” from a normal morphine hit. At about the same time, pain management organizations promoted the idea that “America was undertreating pain,” all kinds of pain, any kind of pain. State lawmakers followed suit, passing laws “exempting doctors from prosecution if they prescribed opiates for pain.” Things then grew like Topsy, with patients soon getting used to demanding drugs for pain (my own doctor in Pt. Reyes, CA told me that to this day, even with all that is known about the opioid epidemic, his clinic sees patients literally demanding pain drugs from doctors); and, more important, doctors becoming convinced that these morphine-based painkillers were virtually non-addictive when used to treat pain. This latter idea stemmed from an alleged “report” in the New England Journal of Medicine by a doctor named Hershel Jick, which claimed that “less than one percent of patients treated with narcotics developed addictions to them.” Drug companies like Purdue producing timed-release opiates seized on this alleged “landmark report,” turned it into a “scientific study,” and promoted their pills as non-addictive, with most doctors quick to believe them. In truth, of course, the Porter & Jick “study” was merely a “one-paragraph letter to the editor” that commented specifically on the use of opiates with cancer patients under strict hospital supervision where the danger of over-use was almost non-existent. But no one bothered to look up the research (there was none), and Purdue Pharma aggressively promoted the use of OxyContin for all kinds of pain relief, at the same time ridiculing the fear of patient addiction as “basically unwarranted.” 
Now what you had in places like Portsmouth, OH were hordes of people looking for relief from their pain (from back pain to sports pain to post-operative pain to existential pain), and just as suddenly, scores of unscrupulous doctors setting themselves up in pain clinics to dispense prescriptions for OxyContin to anyone claiming to be in pain. Any kind of pain. All pain. The result was long lines outside these pain clinics, and beyond that, many now-unemployed people throughout Appalachia signing up for SSI (Supplemental Security Income), because with it they could get a Medicaid card that paid for their pain prescriptions. This Medicaid scam actually became a business for many; they would get their prescriptions filled for a $3 copay and sell the pills on the street, for up to $10,000, to those who had run out. Some bright entrepreneurs actually took to driving carloads of pill seekers long distances to pill mills in Florida and elsewhere—anywhere that pills could be procured more cheaply—for a nice profit. According to Quinones, OxyContin prescriptions for pain rose from 670,000 in 1997 to 6.2 million in 2002. And of course, contrary to the claims by Purdue Pharma that the pills couldn’t be misused, addicts soon learned to crush the Oxy pill and snort it, or even liquefy it and inject it, thus “obtaining all twelve hours’ worth of oxycodone at once” (138). 
The stage was now set for the Xalisco boys to expand their range to those towns outside the major urban centers, especially in the middle of the country, that had a growing supply of opiate users. And they did, moving operators to Portland, Cincinnati, Las Vegas, Portsmouth, Charlotte, Nashville, and everywhere in between. Their appeal was, first, price: their black tar heroin was so much cheaper than prescription pills (which could cost as much as $30 a pill). As Quinones puts it, 
by 2011, you could buy fifteen balloons of potent Xalisco black tar heroin for a hundred dollars— $6.50 a dose, about the price of a pack of Marlboros—in Charlotte, NC, a town that barely knew the drug a few years before (229).

The second appeal was the delivery system. The Xalisco boys had early discovered that it was easy, in the suburbs, to set up personal delivery to addicts. All an addict had to do was call a cell phone number; a driver would deliver the dope in minutes to an agreed-upon place like a mall parking lot. And since the Mexican drivers were trained to carry neither weapons nor excess dope—only five or six balloons with pellets of heroin inside, which they kept in their mouths—they were almost never caught carrying heroin. If police came, they would simply swallow the balloons. Even when caught, their punishment was usually simple deportation, and another driver would soon replace them. The third appeal, of course, was the potency of their product. As noted above, black tar heroin was up to 80% pure. This was a boon at first, but then it had its negative effect: overdoses by addicts who were used to a far-less-potent hit. 
            It was the overdose deaths that began to alarm officials throughout the nation. Two government workers in Olympia, WA, Jaymie Mai and Dr. Gary Franklin, began to investigate the overdose deaths in their state and found they were increasing at alarming rates, and in tandem with the amount of opiates being prescribed, or rather “overprescribed.” When they published their results recommending strict prescribing limits for opiates, though, they were not greeted as heroes; rather, Mai and Franklin were sued by a doctor named Merle Janes, who alleged that their opiate guidelines exhibited “an extreme anti-opioid discriminatory animus or zealotry known as Opiophobia” (234) that would corrupt the management of public health policy. Fortunately, Dr. Janes’s lawsuit was thrown out of court in 2011, and Washington state issued its guidelines, soon followed by several other states. But it had taken a long time, perhaps too long. In 2005, for example, an epidemiologist in the Ohio Department of Health noticed not only that poisoning deaths in Ohio were increasing alarmingly, but that most of them were drug overdoses. By 2007, further analysis revealed that drug overdose deaths in Ohio were about to, and did in 2008, “surpass fatal auto crashes as Ohio’s top cause of injury death” (249). This, again, corresponded with the amount of oxycodone dispensed, which rose by almost 1,000 percent in Ohio between 1999 and 2008. Put another way, “the number of Ohioans who died from overdoses between 2003 and 2008 was 50% higher than the number of U.S. soldiers who died in the entire Iraq War” (250). 
            There is much more detail in this astonishing and infuriating book, but to bring it up to date, here are some facts about the state of the opioid crisis in the United States as of 2018. The U.S. is still, by far, “the leading prescriber of opioids in the world” (this and all subsequent quotes from Vox.com, 3.6.18, Ella Nielsen, “America’s opioid crisis has become an ‘epidemic of epidemics’”). Driven by the drugs named above, plus the powerful synthetic opioid fentanyl, “drug overdoses claimed 64,000 lives in 2016 alone, more than the entire death toll during the Vietnam War.” And finally, because of intravenous drug use, “there are 40,000 to 50,000 new cases of bacterial endocarditis (infection from re-using unclean needles) in the U.S. each year,” which cost “more than $120,000” apiece to treat. When added to the new cases of Hepatitis C from the same source, this amounts to what researchers now call a “syndemic,” or “multiple diseases feeding off one another.” The two Washington State researchers, Jaymie Mai and Gary Franklin, put this another way, calling the opioid crisis “the worst man-made epidemic in history, made by organized medicine” (Quinones, 310). 
            One has to agree with this, and with Quinones, when he underscores the take of Martin Adler, professor of pharmacology at Temple University, on morphine as “a great metaphor for life.” Here’s how Adler puts it, referring to the perverse idea, which prevailed for a time, that pain was somehow to be exiled from human experience:

“The bad effects of morphine act to minimize the use of the drug, which is a good thing. There are people born without pain receptors. [Living without pain] is a horrible thing. They die young because pain is the greatest signaling mechanism we have” (313). 

To attempt to take away that signaling mechanism, as American medicine tried to do, was more than a terrible blunder; it was a fundamental misunderstanding of life and the human organism—how it works and how it is designed to work. 

Lawrence DiStasi


Tuesday, April 17, 2018

Dear Facebook

Like everyone else, I followed the highlighted testimony of Mark “the zombie-shark” Zuckerberg before Congress. And what I have decided is that there is simply no way these creeps are going to desist from their surveillance of me and billions of others—because it’s their business. Facebook followers like myself sign up so we can see the pictures and doings of our relatives, mostly the kids, and in doing so unintentionally allow the Facebook creeps to harvest all our information and preferences so they can sell it to advertisers who want to target their ads to us. In short, the business isn’t about connecting people to each other; it’s not a “service” as Zuck likes to say; it’s about spying on everyone everywhere to get their information and preferences, so the data can be sold for billions to the greedy corporate bastards who want to precisely target customers (and apparently to Russkies who want to target potential voters). If they know I’m interested in cars, they can send me ads for new cars. If they know I’m interested in underwear, they can send me ads for underwear. I’ve had this happen recently: I looked on a couple of websites for packages of underwear to redeem a gift card, and since then I haven’t stopped getting underwear ads on every site I go to. At one point recently, I was pissed off at a rise in my car insurance rates, and so checked out a site that promised cheap car insurance for low-mileage drivers. I’ve since been besieged daily with teasers for “Ford owner in Bolinas” trying to lure me to check out a great insurance rate (which turns out to be the same each time: they feed me to some company that asks for more information and then feeds me to someone else for the same routine, but never, ever do I get an actual rate quote). 
            So here’s what I’ve decided. Ok. You’ve got my browsing records. You’ve got my “likes” and the articles I’ve seen posted by others on my Facebook feed that I’ve re-posted. You’ve seen my blogs that I’ve posted, and articles I think are worth posting, so you know what I like, or think you do. So to save you time (with the saved time, you’ll have ample time to shove my preferences up your blowhole), I’ve decided to put it all out there and let you pursue me with your best shot.
            --I need underwear every once in a while. I like boxer briefs.
            --I need socks periodically, though my sister bought me enough one recent Christmas to last me for years. 
            --I need gasoline for the little driving I do, but you can’t sell that over the internet.
            --I read political articles on some sites you may not like much: Truthout, Common Dreams, Consortium News, Counterpunch, Reader Supported News, Nation of Change, Daily Kos, and others of that lefty ilk. I also read anything I can get my hands on that shows why this nation is going into the toilet, and how we might hasten the demise of its current economic system, to bring about a more equitable distribution of wealth, health and happiness. That includes revolutionary change of the kind that doesn’t involve buying lots of useless shit from big box stores, or going out to stupid restaurants that prepare tasteless, nutritionally disastrous meals; quite the contrary: it involves, quite precisely, refraining from the great American pastime of buying dumb shit or wasting time on websites that allow us to observe others posting about the dumb shit they’ve just bought or eaten, and instead procuring only what one needs, when one needs it. Taking pride, in fact, in not buying useless products that only help destroy our planet. 
            --I have decided that keeping up with the news from my relatives may make it worthwhile to maintain my Facebook account, at least for a time. But know that you can spy all you want and target me with dumb commercials all you like—I’m not about to buy a damn thing. Not a damn thing.  
            So spy away, Zuckerberg and company. These eyes refuse to buy or even notice what you’re selling, nor do I have even the remotest intention of getting a smartphone or any apps that make it easier to tune in and buy said dumb shit. All I might buy are books, and I don’t like reading books on screen in any case; I like the old-fashioned paper kind you can cozy up with, which is also what I like to write. 
            In closing, let me summarize what I really mean to say by saying to Facebook what my Great Aunt Zi’Carmela would’ve said: a’ fottuta. 

Lawrence DiStasi

Friday, March 30, 2018

Good Friday

I woke this morning rather early, for my second pee, with dawn’s light just beginning to seep in from the east, when, glancing through my bathroom window to the west, I saw something I don’t remember seeing before: a full moon, already about a third obscured, setting in the west. It seemed uncanny to me. I had never seen a moon-set before, at least not that I remember. I’ve seen a quarter or half-moon in the western sky high up, while the sun rose in the east, but never this—a full moon looking almost like the sun setting, only it was this bright, glowing moon. Later, I saw an item in my Bing news feed: that this was a special moon in another sense; it was a “blue moon,” the second full moon in a single calendar month. And, a bit later, that it was also a “paschal” moon—the moon around which Passover, and therefore the Easter that always follows the Jewish holiday (since Jesus was a Jew and celebrated his Last Supper in honor of Passover), also revolves. 
            Of course, it’s not Easter yet. It’s Good Friday. And that name also gives me pause. I mean in what sense is the gruesome crucifixion that Good Friday celebrates “good?” Why do Christians celebrate (as many adherents of other religions often ask) the brutal torture and killing of their God? Why have they made it the central symbol of their religion? Why do religious people wear crosses or even crucifixes (metal images of their dead God hung from the cross) around their necks? 
            I haven’t thought about these things for many years, mainly because I have been practicing Buddhism since my thirties and Christianity has seemed mostly irrelevant. But of course, like many others, I still mark Christian holidays by attending special family dinners, featuring traditional meals that I love. And periodically I write a blog about one of the holidays and the philosophy behind it. As I’m doing here. What is this Good Friday thing? Does it depend on knowing that the dead God, Jesus, will rise again on Easter Sunday (implying that we, too, will survive our death)? Is that why the Friday of his death is “good?” Or is the Friday of the God-slaying “good” even aside from the joy of the resurrection? 
            If I remember correctly, there is something “good” about the death of Jesus, about his sacrifice, because he is thought to be redeeming all Christians (and perhaps all of humanity) thereby. His death is an atonement, a sacrifice to his Father, the big Macher, for the “sins” of humanity. And because he is believed to be no less than God himself, this sacrifice, this death of God himself, is sufficient to make up for, to atone for the horrible sin(s) of humans. And that, of course, raises the key question: what have we humans done that we require such an awful atonement? 
            If we follow orthodoxy and the Bible, we would have to say that humans, in the persons of Adam and Eve, sinned against God by eating of the forbidden fruit. As Milton put it in Paradise Lost: “Of Mans First Disobedience, and the Fruit/ Of that Forbidden Tree, whose mortal taste/ Brought Death into the World, and all our woe…” So there it is: this flouting of God’s command in Paradise set the stage for the coming of Jesus to perform his sacrifice and redeem us all from what Roman Catholics call “original sin.” But clearly, we’re not just talking about an apple here (or even a pomegranate). The Bible is referring to something more critical, more outrageous. It is referring to some deep transgression, a crime so unforgivable that only the death of God’s son could atone for it. What could that be? 
            Given the symbolism, we might reason, it had to do with eating—since eating the fruit is what causes the problem in the first place. And this reminds me of the first novel I ever wrote, still unpublished, called Eat. It concerns a character who decides that human life disgusts him so much that he will stop eating. Stop eating altogether. Not a fast for some political reason, but simply as a sign of his disgust with the human situation. It was based on some fasting done by a student of mine when I was teaching at a small college in Gettysburg, Pennsylvania, and I found it so fascinating, especially for an Italian, that I turned it into a novel. The point here is that the act of eating itself, no matter how we try to sanitize it, brings with it all kinds of ethical problems. If we eat meat, we are essentially eating the dead bodies of animals. And that requires that we take life to sustain life, that we kill something living to survive. And even if we limit our consumption to vegetable products, we are still taking vegetal life from plants that would prefer to proliferate on their own. There is simply no getting away from it: Life lives on other life. Living things survive by killing and consuming other living things. And so, if one wanted to live a perfectly blameless life, one would have to find some way to survive on air, or light, like the angels. This is the dream behind religion, not to mention the veganism so popular in our time: there must be a way to survive without killing other beings. 
            If we were to expand this further, we could easily see that our current predicament, in this twenty-first century of the Christian era, stems from related problems. Humans have turned out to be the most lethal creatures ever to roam this earth. We have, anthropologists tell us, been directly responsible for the extinction of countless species—for example, several species of large mammals that once roamed the North American continent. Currently, our food production practices (what we do to raise cattle, pigs, fowl, and even fish on farms) are causing endless mayhem to the natural world. To provide grain for these animals, we have increasingly destroyed forests that are the literal lungs of our planet. The excrement from our farm animals is a major source of pollution, including methane, one of the most potent of greenhouse gases. And our industrial fishing of the oceans has depleted fish stocks to nearly the point of no return. The industry and population explosion made possible by this food system has led to global warming and what is turning out to be the greatest mass extinction of other species in history. 
            So, yes, our eating practices, our virtually infinite appetite for flesh to sustain us, could well be said to constitute our original sin. The flaw in the very biological construct that we are. But isn’t it the case that we are simply following the pattern laid down by evolution itself? Aren’t we simply doing what all living creatures do, and must do—consume each other? We are. And here we come to the real flaw. The root flaw, as I reflect on it, is our inability, our refusal to accept who and what we are. Our determination to make of ourselves, the species homo sapiens, something special. Something out of the ordinary. Something separate and unequal. Something that can somehow, no matter the cost, circumvent the necessity of our being. And so we invent gods, and sons of gods, who come into existence to save us from ourselves. Save us from being human. Jesus on the cross redeems us—and thereby makes it ok to do what is repulsive—consume the flesh of other beings. Then and thereby, we become humans who are above and beyond all other creatures. We are the special creature, the one that god saved by sending us his only son to die for us. And we can comfortably go on despoiling the earth and its creatures because we have been given that dispensation. We have been given Good Friday. We have been given redemption. We have, essentially, been given the earth to do with, to despoil as we will. 
            I think it is clear what the problem is here. What the “good” in Good Friday means to signify. And what we must all do, somehow, to move beyond our root refusal to accept our aliveness must involve our human-ness and the responsibility that comes with it. The responsibility, I mean, to find a way to embrace our kinship, our continuity with all of creation, that can keep us from consciously or unconsciously destroying it. An ability to somehow consume what we must consume with some sort of reverence. Some sort of restraint. Some sort of appreciation. Some species of willingness to allow all else to do the same in the precious and precarious balance all have been given. 
Even when it means, as it inevitably must, that we ourselves are the consumed. 

Lawrence DiStasi

Saturday, March 10, 2018

Kim, the Trumpster and Denuclearization

World news outlets were rocked this week by Donald Trump’s alleged agreement (the news was actually broken, rather strangely, by the South Korean delegation that brought the North Koreans’ offer to the White House) to hold a meeting with North Korea’s leader, Kim Jong Un. Given Trump’s belligerent rhetoric of only a few months ago, in which he threatened to rain “fire and fury” upon the North Koreans, it was surprising to say the least. But perhaps it was really not so surprising after all. For Trump clearly sees in this nuclear summit the prospect of great headlines throughout the world—and if there is one thing this global narcissist lives for, it’s headlines announcing a “first” for him (not to mention getting other less savory news off the front pages). Whether he’ll be able to pull something real out of his toupée is something else again. For as usual, the headlines mistake what Kim Jong Un has offered. He did NOT say he was willing to give up his nukes. He said he is willing to discuss the “denuclearization of the peninsula.” What does that mean? Not exactly clear. But one thing is certain: Kim Jong Un, like his father and grandfather before him, wants to get out from under the American threat, both nuclear and otherwise. He wants to get a formal end to the Korean war (not just a cease-fire). He wants a halt to the joint military exercises put on twice a year by American and South Korean forces, including simulated nuclear strikes. He wants an end to punishing sanctions. So when he says “denuclearization,” if he means it at all, he’s talking more about getting rid of American nukes (including those at the disposal of South Korea) than getting rid of his own. And he may just figure that if he can get Trump committed to a huge summit meeting, the president may be so rash as to agree to something that would be a huge victory for North Korea. Already, in fact, Kim has gone a long way toward getting what he has long sought: using his nuclear testing to induce the United States to recognize him and his country as equal enough to deserve a summit meeting.
            But enough speculation. Here I would like to simply provide the background I adduced way back in September, when the explosions from both sides were giving the world daily agita. This is important to keep in mind when, as no doubt will happen, our media outlets drum up the heart-stopping “US v THEM” dramatics should a summit actually take place. Hence, this blog, first posted on September 5, 2017. It’s titled “Who’s Begging for War?”

As North Korea ups the ante once again, this time with a massive nuclear blast that some observers (and the North Koreans themselves) are calling a hydrogen bomb, the rhetoric coming out of the Trump administration gets more belligerent by the minute. As I noted in a previous blog, the two adolescent leaders—Kim Jong Un of North Korea and Donald Trump of the U.S.—are engaged in a pissing contest. ‘My dick is bigger than yours; look how far my piss goes.’ Only it’s not piss that’s being compared; it’s weapons of such massive destructive power that most rational humans shudder to even contemplate their use. But not the Donald. “What’s a nuke for, if we can’t use it?” he once said. Most recently, Nikki Haley (who, before becoming our UN Ambassador, seemed semi-rational) has been uttering nutter phrases like “We have kicked the can down the road enough. There is no more road left…” No more road for diplomacy, is what she seems to mean, especially considering that she also said Kim Jong Un is “begging for war.” The President himself tweeted much the same thing, berating the South Koreans for their “talk of appeasement with North Korea,” which will “not work, they only understand one thing.” That is, violence of the nuclear variety.
            And as we all look on in horror as nuclear Armageddon looms ever closer, we have to ask: Just who is it that’s begging for war? Can the world really believe that North Korea, a nation of 24 million people whose economy seems permanently hobbled, and whose military, while large, would be no match for that of the United States and South Korea combined (the South itself may have nukes in its huge military arsenal supplied by the United States), actually wants a war? Or is it rather Donald Trump—he whose administration has lurched from one failure to another without a single legislative victory, with an approval rating that’s the lowest of any president in modern history—who is really searching for a ‘wag-the-dog’ solution to distract us all from his mounting problems?
            To really probe this question, especially the one concerning what exactly Kim Jong Un thinks he’s doing with his rockets and nukes, we need to know a bit about history (which most Americans, especially their idiot president, do not). My source is an article that appeared on consortiumnews.com last week: “How History Explains the Korean Crisis,” by distinguished historian William R. Polk. In it, Polk makes sense of North Korean belligerence by detailing the long history of invasions Koreans have suffered, starting in 1592 when Japan invaded and controlled the country for a decade or so. The Japanese invaded again in 1894, and this time set up a ‘friendly government,’ thereby ruling Korea for the next thirty-five years. It was in this period that many Koreans fled the country, including Syngman Rhee (the first president of the South), who fled to America, and Kim Il Sung (the first leader of the North), who fled to Russia-influenced Manchuria, where he joined the Communist party. By WWII Japan had reduced many Koreans to virtual slaves (thousands of Korean women became “comfort women” or concubines for the Japanese Army). But what’s fascinating to me is what happened to some of those Koreans who became rulers in the post-WWII period. Syngman Rhee, long resident in the United States and ‘Americanized’ (not to say ‘Christianized’), was set up as the first president of the new South Korea (North and South were vaguely established by the UN in 1945, but Rhee officially became the South’s ‘president’ with American help in August of 1948). He ruled basically as a U.S. puppet, with the United States military assuring his continuance by sending thousands of U.S. troops to support him, and American industry assuring the economic rise of his part of the country. According to William Polk, “Syngman Rhee’s government imposed martial law, altered the constitution, rigged elections, opened fire on demonstrators and even executed leaders of the opposing party.” His successor (via a military coup in 1961), Park Chung Hee, had spent the war years in Korea collaborating with the Japanese occupiers (he apparently even changed his name to a Japanese one). His rule as president was so vicious that he, too, was overthrown (assassinated by his intelligence chief in 1979) and replaced first by Choi Kyu-hah, who was then deposed in a military coup by Gen. Chun Doo-hwan, who himself immediately imposed martial law that closed universities, banned political activities and throttled the press. A book I’ve read recently, Human Acts by renowned Korean novelist Han Kang, dramatizes the university and high school protests of 1980 in Gwangju that were savagely put down by Chun Doo-hwan, who ordered soldiers to coldly shoot over 600 young protesters and bury them in mass graves.
            By contrast, Kim Il Sung, who was to become the leader of North Korea after WWII for no less than fifty years, spent the war years as a guerrilla fighter in Russia-influenced Manchuria, where he led resistance to the Japanese occupiers. His status as a hero was established there, and he soon became the first Prime Minister of the North, which he declared a state in 1948, with ambitions to reunite North and South (Syngman Rhee had announced the same intention, as ‘reunification’). But when Rhee declared that the South was a fully independent state, Kim Il Sung saw it as an act of war, and (once China had agreed to take responsibility for the outcome) ordered his army to invade the South. Far better equipped and motivated than the southerners, Kim’s army took possession of Seoul, the South’s capital city, within three days, on June 28, 1950. By this time, the U.S. had persuaded the UN Security Council to protect the South, and organized 21 countries to send troops (though Americans made up the bulk of the forces in what was called a “police action”). Still, Kim’s military drove the southern army all the way south to the city of Pusan, where, by August, it “held only a tenth of what had been the Republic of Korea.”
Here the situation was saved for the South only by the brilliant counterattack led by General Douglas MacArthur, who made a storied landing at Inchon, where, behind enemy lines, the Americans were able to cut off the Northern army from its bases. That led to a further attack by the South, which retook Seoul, and then moved across the 38th parallel (the dividing line between North and South) and drove nearly to the Chinese frontier. This brought China into the conflict; with what it called a 300,000-man “Volunteer Army,” the Chinese overwhelmed the South Koreans and drove the Americans out of the North. At this point, General MacArthur urged President Truman to use fifty nuclear weapons to stop the Chinese, but Truman instead replaced MacArthur and continued the more or less conventional war. Except that it was not at all conventional for the North. U.S. carpet-bombing devastated the North with more tonnage (including chemical weapons) than had been used against the Japanese in all of WWII. Analysts today estimate that the North lost 33% of its population through this bombing—one of every three North Koreans perished. As Polk puts it, Korea “proportionally suffered roughly 30 times as many people killed in 37 months of American carpet-bombing as these other countries (Britain, France, China and the U.S.) lost in all the years of the Second World War.” This may help explain why North Koreans generally favor their government’s stance to repel invaders at all costs: most have experienced the utter devastation of war firsthand.
            Finally, the North agreed to negotiate a cease-fire to end the stalemate (the state of war between North and South still exists), with the country divided at its 38th parallel by a demilitarized zone to keep the armies separate and to keep ‘new’ (i.e. nuclear) weapons out of the peninsula. Unfortunately, the United States, in 1957, violated article 13(d) of the agreement:

In June 1957, the U.S. informed the North Koreans that it would no longer abide by Paragraph 13(d) of the armistice agreement that forbade the introduction of new weapons. A few months later, in January 1958, it set up nuclear-tipped missiles capable of reaching Moscow and Peking. The U.S. kept them there until 1991. It wanted to reintroduce them in 2013 but the then South Korean Prime Minister Chung Hong-won refused (Polk op. cit.).

Thus, we see that it was the United States that decided to introduce nuclear weapons to the Korean conflict. But what about the North Koreans and their nukes? Even here, Polk points out, both the South and the North had agreed to abide by the Nuclear Non-Proliferation Treaty (in 1975 and 1985, respectively), but both violated the agreement (South Korea covertly from 1982 to 2000; North Korea in 1993, withdrawing totally in 2003). Polk also adds that the precipitating event for the North’s withdrawal, and for the underground testing it began in 2006, was George W. Bush’s January 2002

Axis of Evil speech, in which he demonized North Korea. Thereafter, North Korea withdrew from the 1992 agreement with the South to ban nuclear weapons and announced that it had enough weapons-grade plutonium to make about 5 or 6 nuclear weapons (Polk op. cit.).

            This brings us to today. As noted in a recent article (Mel Gurtov, “Echoes of Reagan: Another Nuclear Buildup,” commondreams.org: 9.3.2017), the United States currently has about 6,800 nuclear weapons (roughly 1,400 strategic weapons deployed, the rest stockpiled or retired). Among these, the 920 nuclear warheads deployed on some 230 missiles aboard invulnerable submarines are alone “enough to destroy an entire country and bring on nuclear winter.” By comparison, North Korea may have about a dozen nuclear weapons (some analysts say they could have as many as 60), most of them about the size of the ‘paltry’ nukes that devastated Hiroshima and Nagasaki. It also has the still fairly rudimentary missiles it has been launching with frequency this year, very few with the ability to strike the United States or anywhere near it. So what is this business about the North “begging for war?” It is an absurdity. What the North is really after is simple: survival; a resolution of the South-North war, which has been ongoing since the armistice in July 1953; and, related to that, an end to the huge and provocative war games that have been carried on for the last three weeks. These ‘games’ go on twice a year, and are clearly designed to threaten the North by simulating an invasion of North Korea and a “decapitation” operation to remove Kim Jong Un. What would the United States do if Russia were to carry on war games from Cuba or Mexico? We already know the answer to that. Yet despite the continuing pleas of the North to the U.S. and South Korea to cease these provocative military exercises, the U.S. and its protégé have persisted and even expanded them ever since the end of active fighting. In addition to these regular war games, the United States has recently sent groups of F-35B fighters, F-15 fighters and B-1B bombers on military operations over a training range near Seoul, where they dropped their dummy bombs to simulate a nuclear strike. According to Mike Whitney (counterpunch.org, 9.4.2017, “What the Media isn’t telling you about North Korea’s Missile Tests”),
The show of force was intended to send a message to Pyongyang that Washington is unhappy with the North’s ballistic missile testing project and is prepared to use nuclear weapons against the North if it fails to heed Washington’s diktats.

That’s it exactly. For this is the way the world according to American empire works: we can hold threatening war games, we can surround you with nukes from submarines and bombers and missile launchers, we can insult you and threaten you and starve you and humiliate you and refuse to end our war against you, but if you dare to stand up to our bullying, we will destroy you. And it’s your own fault for defying our ‘rule of law.’
            But of course, the North sees through this. Kim Jong Un may be a clown in a funny haircut who’s trying to prove he’s a big boy now, but he’s no dummy. His nuclear response to the threats from the United States, when considering his people’s history, and his knowledge of recent history, is perfectly rational. As Mike Whitney points out, “Kim has no choice but to stand firm. If he shows any sign of weakness, he knows he’s going to end up like Saddam and Gaddafi.” To remind you, Muammar Gaddafi of Libya finally decided to take the West at its word, and give up his nuclear plans. He was thereafter the victim of an invasion by western powers and ended up publicly violated in a gruesome death, mocked by American leaders like Hillary Clinton: “We came, we saw, he died.” Ditto Saddam Hussein of Iraq, whose country is in ruins. Kim Jong Un would clearly like to avoid that fate. He and his people would like to avoid being bombed back into the Stone Age, again. And so they are gambling that the blustering primate in Washington will either run true to his cowardly form, or be persuaded by calmer and more rational minds to see if there might not be an opening for negotiations. In fact, we all have to hope that this is the case. China and Russia also hope that this is the case, proposing once again (as they did in March and often before that) that in exchange for a halt to the military exercises by American and South Korean forces, North Korea could be persuaded to freeze its nuclear and missile programs. Surely, there is the germ for a diplomatic agreement here. Even South Korea’s new president, Moon Jae-in has just reiterated his offer to hold peace talks with North Korea (newsweek.org, 9.5.17) in what he has called his “Sunshine Policy.”
The only real question is whether the United States, and especially its wacky president, will ever agree to stop the war games. Because, after all, we are the Americans, the big dogs, who don’t back down, don’t negotiate unless it’s totally on our terms. Which in this case, means: do what we say not what we do, give up your nukes, and we can discuss the terms of your unconditional surrender. Anything short of that is “begging for war.”

            Addendum, Mar. 10, 2018. Since writing the above, I have read Min Jin Lee’s novel Pachinko. In it, Lee narrates the saga of a Korean family that, during the pre-WWII period, moves to Osaka, Japan, to seek its fortune there. What we learn is the agonizing plight of Koreans in Japan (and in Korea under Japanese colonization), who are always and everywhere discriminated against as “lazy” and “morally defective,” even when successful and allowed to become Japanese citizens. This helps to explain both the never-healed antagonism between Koreans and Japanese and the never-outgrown defensiveness of Koreans that persists to this day—including, perhaps, some of the behavior of Kim Jong Un himself.

Lawrence DiStasi