Friday, June 29, 2018

Just Barely Tolerable

The news of the last week has put me in mind of a concept from Mahayana Buddhism (and indeed from the Judeo-Christian tradition as well) which holds that our earthly world in its present condition is “just barely tolerable” (I first heard this phrase in a recent talk by Anbo Stuart Kutchins). The Buddhist term for this is “Saha world,” and it is a key element in the Buddhist understanding of suffering in human life. I will address this below, but for now, I am simply interested in the concept—that the world as we know it is “just barely tolerable.” An Italian writer named Mario Brelich has referred to a similar idea in a kind of novelized essay called The Holy Embrace. There, he writes about how humans, after being ejected by God from the bliss of Eden, had to live in another world that would be “just barely tolerable,” so long as they submitted to the divine will, that is. Whether or not Brelich knew of or had access to Buddhist philosophy is not clear, but since he wrote his book in 1972, it certainly seems possible. No matter. The takeaway here is the notion of our human world as one which is “just barely tolerable.” Anyone who has lived long enough and reflected deeply enough (Nietzsche apparently had a similar idea of his world) must surely agree that the world, and life as we know it, is almost, but not quite, insufferable. Intolerable. Almost, but not quite. 
            The way I think of this is that, considering the news of last week, those who usually thrive and rule in our world often seem bent on making the world intolerable for everyone else, especially the thinking person, the half-way compassionate person. Take the child abuse inflicted on Central American migrants seeking asylum in the United States. These are people who have suffered untold misery in their home countries (El Salvador, Guatemala, Honduras) from both corrupt authorities (usually enabled and supported by the United States) and out-of-control gangs. Indeed, it has been a truism for centuries that no one abandons home, relatives, neighbors, language and everything else that makes life worth living without enormous pressure either from violence or economic deprivation. So it is with people fleeing gangs in El Salvador or right-wing death squads and narco-traffickers in Guatemala or Honduras. To have even reached the Mexico/U.S. border requires an amazing odyssey that has subjected them to untold abuses. And yet, when they finally reach that dreamed-of border nearly mad with anxiety, these parents are met with the most unimaginable outrage of all: separation from their children. It is an obscenity that is all the more obscene because it is unnecessary: these are not criminals; they are refugees, asylum seekers. They want nothing more than safety for themselves and their young children. And yet, Trump and his Administration, almost gleefully (especially in describing the policy to their rabid supporters), have brought criminal charges against them, charges of illegal entry for which they must answer in court. And this has necessitated their separation from their children. Now thousands of these children are caged and held apart, without any indication of when, or whether, they can ever be reunited with their parents, some of whom have already been deported. On its own, this is simply intolerable. 
            Then came the Supreme Court decision of June 26, which upheld the Trump administration’s third try at a travel ban against Muslims trying to immigrate. Through a transparent sleight of hand, the administration cosmeticized the original ban to include two countries that are not majority-Muslim—North Korea and Venezuela—so as to be able to claim innocence where religious animus is concerned, so as to be able to claim national security as its aim. And the conservative majority on the Supreme Court actually fell for this, or rather, has been slowly built over years to approve of such chicanery. And so they did, ruling that the President of the United States has near-unlimited authority to protect national security, though at the same time retroactively condemning the similar presidential order incarcerating Japanese Americans during WWII. It was a bravura performance of hypocrisy that would have pleased the 1857 court that wrote the Dred Scott decision. 
            But not satisfied with that, the Supremes followed that decision with two more decisions that put an exclamation mark upon their 2018 season: one reversing a Texas court’s ruling banning obviously racist gerrymandering, thus allowing racism to flourish in our voting system once again; and another granting a government-employed claimant the ‘right’ not to pay union dues, even though he benefits from union actions. This will strip unions of much of the funding they need not only to protect their workers in the future, but also to help fund Democratic candidates—the real point of the conservative decision. As if this were not enough, Justice Anthony Kennedy thereupon announced his retirement, thus paving the way for old Hog-Belly to nominate yet another hyper-conservative justice, this time to pollute the court for a generation, and probably dooming Roe v. Wade in the process. The triumph of vulgarity and stupidity and cruelty could not be more complete. Intolerable. 
            But we are assured that our world is “just barely tolerable.” Is there anything, in the face of all this horror, that makes it so? We can all count the ways. There are flowers that bloom, regardless of the hostility emanating from human poisons. There are also vegetables and trees and fish (barely holding on, it is true) and deer and rabbits and bear and mice and foxes (I just saw one scratching in my yard) and sharks and whales and hawks and coyotes that appear periodically to assure us that nature cannot be so easily suppressed, no matter how much glyphosate or fossil fuel we spray on it and over it and through it. Or how much carbon and plastic we inject into its air and oceans. And we know in our bones that once the human stain is gone from this earth, the natural world will rebound with unalloyed joy. Salmon will again thicken rivers and bears will feast on them as they journey upstream to spawn past rejuvenated forests and meadows and purified air. Then there are our grandchildren, eager and beautiful and energetic in their innocent anticipation of growing up so they can taste the world that seems so appealing to them if only they could be adult and free. And as we watch, we can only hope that there will still be a world, “barely tolerable” though it might be, for them to fill out as we did. And then there are those courageous types everywhere who refuse to be threatened or deterred, who let their compassion and their fire drive them to relieve the suffering of those at the border or those under the boot or those fleeing global warming or those attempting to find a way to live on the streets. They are there, they are many, and they are models and inspiration for us all. 
            Which brings me back to the start of this essay. The Saha world is a concept made vivid in the branch of Buddhism known as Mahayana. And what it points out is that, though there are said to be other wonderful realms with names like the Pure Land or the Perfume Universe where everything is perfectly lovely (akin to Christian Heaven, perhaps), the Saha world is really best for us humans not just because it’s what we’ve got, but more precisely because of the struggle and hardship we find here. Struggle and hardship are beneficial for us, we are told; or rather, they are beneficial for those who adopt the ideal of the “bodhisattva”—the being who vows to eschew the personal liberation that he or she might obtain in favor of waiting until all beings are liberated. No thank you, says the bodhisattva when his own personal liberation is at hand; I prefer to remain here and help others. For such an ideal being, in short, the struggles and hardships and sufferings endemic to the Saha world are precisely what is needed for her development; development of the great compassion that keeps her here in the thick of things helping all others. In other words, the world just as it is seems perfectly designed for the development of that which the world itself needs to, eventually, wake up. And it becomes plain to such a being that no one, not even the most advanced of beings, can actually wake up alone. No one. Waking up is exactly that which is done together with all beings. 
            So, though in our more desperate moments (like now), we would, if we could, wipe out all the troubles and problems of the world by whatever means necessary to try to bring about some utopia or other, in our more comprehensive, our wiser moments, perhaps, we realize that the “just barely tolerable” world we have, doing its deluded thing as always, provides us with the right combination of horror and solace to keep us honest, and human, and, we hope, compassionate enough to never turn our backs on its “slings and arrows.” Because it is precisely those slings and arrows, along with a little bit of proper nourishment, that make us who and what we are. 

Lawrence DiStasi 

Saturday, June 23, 2018

The Post, the Past, and Now

I recently watched the new Spielberg film, The Post, starring Tom Hanks as Ben Bradlee and Meryl Streep as Katharine Graham. While I wouldn’t say the film was award caliber, it was definitely enjoyable and did its best to engender some suspense over whether Graham, the Post’s owner, would defy the Nixon Administration’s injunction against publication of Daniel Ellsberg’s Pentagon Papers. Anyone who lived through this, as I did, knew that Graham would publish, but the film nonetheless succeeds in building the tension over her decision—which involved not only the possible insolvency of her paper, but possible jail time for her as owner, and for editor Bradlee and writer Bagdikian as well. What the film mainly recaptured for me, though, was the joy that prevailed, particularly among anti-war activists, when that edition of the Post containing the Pentagon Papers rolled off the presses; to then be followed by a delirious Tom Hanks dropping on Katharine Graham’s desk copies of a dozen other newspapers across the country that had similarly defied the Attorney General’s injunction against publication. That meant that the strength of all those papers challenging the government and asserting their freedom-of-the-press rights would influence the subsequent Supreme Court decision that agreed with their assertion of those rights. The power of Nixon’s imperial presidency was broken right then and there—which the film makes clear by closing with a night watchman, about a year later, noticing an opened door at Democratic National Headquarters in the Watergate office building. The initial reporting about this break-in by Nixon’s so-called “plumbers” (who had earlier broken into the office of Daniel Ellsberg’s psychiatrist) would lead inexorably to Nixon’s resignation. The Vietnam War would ignominiously end soon thereafter. 
            Given our current dire situation, the film could not help but bring up my hope that something like the miracle of the Pentagon Papers would happen again to bring to ruin the alleged presidency of the Drumpf. Our situation in 1972 was actually similar to what we are experiencing now. Nixon’s grip on the presidency seemed unassailable; despite his 1971 loss regarding the Pentagon Papers, he won a landslide victory over a rather hapless George McGovern in the 1972 election and seemed well on his way to controlling the nation for years to come. But so outraged was he by the Pentagon Papers leak that he set up the “plumbers’ operation,” Watergate became their signature caper, and Nixon was disgraced, humiliated, and forced to resign rather than face inevitable impeachment. Could it happen again? Who knows? But there are forces in motion that could do the same to our wannabe King. 
            We all know what those forces are. To begin with, Donald Trump’s criminality, not just as president but throughout his long career, makes the lawlessness of Nixon’s burglars seem akin to high-school pranks. And it appears that Robert Mueller’s investigation is focused pretty heavily on that lawlessness—the years of real estate scams culminating in the 1990s sale of scores of properties to Russian thugs and oligarchs to launder their money. Paul Manafort, once Trump’s campaign chief, has already been charged with such money laundering, and there are suspicions that Trump Jr. and Jared Kushner have been and still are up to their eyeballs in the same skulduggery. And it is skulduggery. As 18 U.S. Code § 1957 says: 

Whoever, in any of the circumstances set forth in subsection (d), knowingly engages or attempts to engage in a monetary transaction in criminally derived property of a value greater than $10,000 and is derived from specified unlawful activity, shall be punished as provided in subsection (b). 

In layman’s terms, laundering money through real estate is a crime—one that involves using criminally-derived money to buy properties and then selling them. Thereby, the income from the sale will have been “laundered” through property to look legal. Even according to Trump Jr., much of the money that poured into the Trump organization since the 1990s to bail out the then-bankrupt Trump came from Russian sources seeking to launder their loot. These, and a lifetime of other shady deals, not excluding presidential violations of the emoluments clause (whereby foreign visitors to Washington DC, for example, are induced to spend huge amounts of money at the Trump Hotel in that city, or in Florida at his Mar-a-Lago property, benefiting the president directly), could be used to indict the president himself. 
The problem would be pinning something on Trump or any of his henchmen in today’s ethical climate. Indeed, if we look back to the Nixon-era 1970s, we can see that the issue of a Trump presidency would never have arisen in the first place. No one with Trump’s record of misstatements, corrupt business dealing, outright lying, and sexual outrages could even have been considered for, much less won, the nomination of either of the two parties. Yes, Richard Nixon was a crook and a racist, but he covered it with practiced and well-accepted coded language: the ‘southern strategy’ was kept more or less secret from the mainstream, while the ‘moral majority’ did not obviously signify white supremacists but right-thinking Americans in the suburbs. With Trump, though, the racism, the goatish behavior, the contempt for government itself, the scapegoating of the press, the open nepotism with his daughter, and a whole galaxy of attitudes and practices going on quietly to cripple government for decades, are all out in the open. And what is astonishing is that not only do Republican politicians seem deaf to these ethical breaches (making impeachment a distant dream); so does what is accurately called his “base” (the people from the basement, the people who are truly base in the worst sense of that word). An incredible instance of that baseness made the news on June 22, when Dennis Hof, an actual pimp and owner of several whorehouses in Nevada, won the Republican primary for a seat in the Nevada Assembly! Explaining his rather surprising support among those highly ‘moral’ Christian evangelicals, Hof, owner of the Moonlite Bunny Ranch, told Reuters, “People will set aside for a moment their moral beliefs, their religious beliefs, to get somebody that is honest in office. Trump is the trailblazer, he is the Christopher Columbus of honest politics” (“In age of Trump, evangelicals back self-styled top U.S. pimp,” by Tim Reid, www.reuters.com, 6.22.18). The Christopher Columbus of honest politics—you just can’t make this stuff up. But it is clearly what Trump’s supporters, including his evangelical ones, seem to think and feel. We heard it repeatedly during the election: “He tells it like it is.” Which is to say, he’ll insult anyone, especially the “elites” that the great unwashed envy and hate, without dressing up his language. That most of what he says is pure fabrication, outright lies, or worse, seems not to matter. That he takes damn fool actions, like the recent ones against asylum-seekers at the border that amount to child abuse, doesn’t matter. That he is a billionaire who is making money that will never reach his impoverished supporters, whose policies will only impoverish them further, doesn’t seem to matter. He is sticking his thumb in the eye of the hated “coastal elites” whom they blame for their troubles, and that’s enough for them. 
And yet, those of us who watch with horror as this happens can’t help thinking that sooner or later, logic will catch up, truth will catch up, facts and common sense will catch up. And we comb history for parallels (Hitler, Mussolini, Huey Long, etc.) to see if we can understand how this could happen. For me, this comes down to a simple question: how can people believe this known liar? I keep thinking of the song from the 1951 film musical Royal Wedding: “How Could You Believe Me When I Said I Love You?” the second line of which is, “When you know I’ve been a liar all my life?” That’s it precisely. We all know, they all know that Trump is a serial, a compulsive, a habitual liar. And yet they believe. Why?
There’s a mountain of research on the true roots of belief, but most theories come down to a simple equation: people believe what they want to believe. This is generally called “motivated reasoning.” What that means is that we all have fears, or ideologies, or vested interests, or identity needs that influence what we’re willing to believe and what we’re prone to reject, in today’s term, as “fake news.” So, if we’re fearful about something, we’re much more prone to believe even a wildly improbable promise from a known liar like Donald Trump. If we’re anxious about maintaining our group identity, we’re driven by that to believe what accords with the group, especially if they’re all wildly cheering for the same Donald Trump. And the problem with such belief is that it’s not easily dislodged, least of all by introspection or conscious effort or even opposing facts. All of these motivations are discussed in a May 2017 article by Kristen Weir in Monitor on Psychology, “Why we believe alternative facts.” And what we learn is precisely how powerful motivated reasoning can be, how “our wishes, hopes, fears and motivations often tip the scales to make us more likely to accept something as true if it supports what we want to believe.” This is especially true given the media landscape we all live with these days—with fewer investigative reporters as failing newspapers struggle to maintain even skeleton crews; and with the proliferation of ‘echo chambers’ on the internet that reinforce every wacko bias—all of which makes it ever harder for real facts to win out over wishful thinking. And even when vetted facts can be marshaled for an argument against obvious lies coming from the top, believers can retreat into the accusation of “fake news.” In other words, anything that challenges my beliefs can credibly be attributed to “fake news.” Because after all, that’s what the president says all the time, isn’t it? Anything he doesn’t like is “fake news.” Anything that disputes his truth is “fake news.” And he should know: he’s the all-time champion dispenser of “fake news.” 
There is probably one more (or a million more) element here as well. A 2017 book by Kevin Young looks at the endless variety of bullshitters in American history. It’s titled Bunk: The Rise of Hoaxes, Humbug, Plagiarists, Phonies, Post-Facts, and Fake News, and starts with P.T. Barnum (sideshow mogul born in my hometown of Bridgeport, CT) and ends with Donald Trump. And while there’s much to admire in the book, for me the most salient point is simple: it’s not only that the public, as Barnum proved, is eager to be taken in by hoaxes and ‘bunk’; there is also an element of awareness in its attitude which the bunco artist and con man seem to understand. That is, not only is it the case that we are “motivated” believers (Young says at one point about some fake memorabilia of Hitler, “the forgery is believed not because it alters history, but because we wish history were otherwise” p. 276), but also that, on some level, we know that we are being taken in, and enjoy or at least need the game. We want to be fooled, perhaps because the made-up reality is better than the one we are trapped in. Young links this to the “confession” as the dominant form (usually in the form of memoir) in what he calls our ‘Age of Euphemism.’ Here is what he says:

Now the hoax, married to confession, caught in the narrative crisis [where we no longer can tell truth from falsehood, memoir from fiction, ed.], has replaced drink as our national addiction, and substituted loss for feeling lost (439).

In other words, we are, many of us, willing participants, willful believers, junkies ever more addicted to the deception that has become commonplace in our national life. We no longer seem to mind being taken advantage of by internet hucksters, political and financial scammers, bunco artists, and fake presidents. Many of us actually court the hoax, the lie, the gross exaggeration as a kind of entertainment—witness the huge profits being made by media companies that promote scripted ‘reality’ shows and every fart uttered by the Hoaxer-in-Chief.  Of course, these same media companies pretend to be outraged by his “mis-speakings,” but almost all bend over backwards to employ euphemisms rather than calling a lie a lie. They seem reluctant, that is, to kill the goose laying the golden eggs—fake eggs though they may be—that have so many of us mesmerized. 
            And as long as this is so, we are caught, trapped in the need to amuse ourselves, no matter what the cost to our once-vibrant democracy. And it will be considerable.  

Lawrence DiStasi

Thursday, May 24, 2018

Drumpf Backs Out

After weeks of braggadocio from the American “president,” including coy acknowledgments that he might get a Nobel Prize for his agreement to a summit with North Korean leader Kim Jong Un, Drumpf late Wednesday announced publicly that he was withdrawing from the already-agreed-upon meeting in Singapore on June 12. The ostensible reason given by the White House was that North Korea had been lagging in its response to planning meetings, had even canceled one, and was expressing “tremendous anger and open hostility” towards its benefactor, the United States. An unnamed White House official said that “the White House did not want an embarrassing situation of ‘losing the upper hand’” (Washington Post, “Trump Cancels Nuclear Summit,” 5/24/18). This seems like precisely what caused the “great deal-maker” to cancel: he and others in his circle of war-mongers clearly felt that Kim Jong Un’s tough language in recent days was giving the North Korean leader the “upper hand.” Why? Because if Kim canceled first, then that would leave the “deal-maker” like a bride left at the altar. And that is intolerable. This is why Trump and his spokespeople all along, but especially in recent days, have kept repeating their line that a deal would be “wonderful and historic and enriching” for North Korea, but if it didn’t happen, well that was ok too. This seems to be a pattern for Trump: this will be the greatest event in world history, but it doesn’t really matter to me if it happens or not. Reminds one of nothing so much as teenagers maneuvering for a date. “I don’t care.” “Well I don’t care either.” “Nyah nyah!” This is what passes for diplomacy in the age of Trump. 
            What also passes, and it has characterized previous administrations as well, is the notion that when the U.S. agrees to deal with a ‘lesser’ nation (and all nations are lesser in the eyes of most Americans), why that agreement to even meet should be seen by the little dog as the bestowal of unmerited grace. That is the supine posture the North Korean leader has seemed unwilling to adopt. Instead, he has had the temerity to object to the ham-handed and openly insulting behavior of attack dogs like John Bolton (now National Security Adviser), who recently proclaimed that the model for the upcoming summit should be Libya and its abandonment of its nuclear program. As if that weren’t clear enough, that other ham hand, VP Mike Pence, made the same reference to Libya’s downfall in a Monday television interview. Are these people crazy? What do they expect from North Korea when their model is openly stated to be the surrender by Moammar Gaddafi of his nuclear arsenal, followed shortly by his murder and ugly violation in 2011 by the Western nations that invaded him, as his reward? Is this supposed to encourage Kim Jong Un to follow in the murdered leader’s footsteps? It clearly did not; one of Kim’s aides, in fact, publicly labeled the Vice President a “political dummy” for his remarks. That same appellation could be applied to the entire Trump Administration, from top to bottom. They took a golden opportunity handed to them on a silver platter by South Korean President Moon Jae-in, and turned it into a fiasco. As is his wont, Trump thought he could just “wing it,” dispensing with most of the diplomacy that normally precedes and prepares the ground for a huge summit between leaders, casually implying that when Kim Jong Un entered the room with the great American deal-maker, he’d be so awed he’d agree to anything. Especially the rape of his own power. The dismantling of the nuclear capability he’d spent so much treasure to develop. The very nuclear power that had led to the proposed summit (contrasted with the denuclearization that dropped both Gaddafi and Saddam Hussein into the grave), and that now his adversaries in the United States were assuming he’d simply toss into the garbage. You’ve got to be pretty myopic to believe in your own infallibility to that extent. 
            But of course, that is what American leaders, especially Trump and his gang of merry bunglers, do believe. They think they can simply wave America’s military might (which they have been doing at this very time when North Korea is looking for signs of sincerity, carrying on the hated war games with South Korea that Kim Jong Un wants most of all to end), and all will be solved instantly. What they have now found out, or at least got an inkling of, is that the world has changed. Even an impoverished nation of 24 million like North Korea is not willing to be bullied into giving up the only security it has for the vacant promises of the great hegemon. Because the great hegemon has been demonstrating, even in recent weeks, that its word to “little dogs” is worthless. Less than worthless. Trump’s recent withdrawal from the Iran nuclear deal is another case in point. Worked out over years, the flame-haired wonder trashed that in a few minutes—at the very time he and his administration were trying to persuade North Korea that the word of this very U.S. leader, in any upcoming nuclear deal, would be honored. Again, are these people crazy? Do they think everyone is as dense as they are? As the idiot Americans who voted for them? Apparently, they do. 
            So they’re still expressing optimism in an eventual summit one of these days. When North Korea will come to its senses and give up its nuclear weapons entirely. Because that’s what “denuclearization” has meant to these ahistorical idiots. 'Our tough stance has worked. Kim Jong Un has blinked. He’s willing to give up his weapons, for nothing in return.' 
            Nope. As I’ve written in a previous post (see “Kim, the Trumpster, and Denuclearization” Mar. 10, 2018), if “denuclearization” means anything to the North Korean leader, it means ending the nuclear threat from the United States and South Korea at the same time as the North ends its threat. Maybe. Many observers think that even that is a long shot. Think that Kim would never give up what he’s won at such great cost. But of course, observers don’t have the overweening arrogance of the Drumpster. Which is really the problem here, and everywhere. The overweening arrogance of the Empire. Which, as history has shown again and again, is the prelude to its downfall. But don’t tell the great flame-thrower. He might cry; or cry havoc. And bring us all down with him. 

Lawrence DiStasi

Tuesday, May 1, 2018

That Old Devil Sugar

It is surely no secret that the United States, and, increasingly, the rest of the world, is wallowing in an epidemic of diabetes (the CDC says that one of every four American teenagers has prediabetes or type 2 diabetes, with the expectation that by 2040 it will be one in three. Teenagers!), and that many doctors and other healthcare providers consider sugar to be at least one culprit. It is therefore somewhat surprising that Gary Taubes, in his latest book, The Case Against Sugar (Knopf, 2016), takes the position that there is a serious debate, if not a war, raging about this, and that he must therefore prosecute sugar as responsible for just about every major ill we have, including cancer and Alzheimer’s. For my purposes, though, I’d just like to see agreement that sugar is clearly responsible for a few: diabetes, metabolic syndrome, heart disease, and hypertension (high blood pressure). That’s because I have been diagnosed late in life with two of those at least: type 2 diabetes and high blood pressure. I have both more or less under control, but anything that can bring clarity to what causes either or both is welcome. 
            For Taubes, the case is crystal clear. The culprit is sugar, especially the refined sugar called sucrose (half glucose, half fructose) and the ingredient that seems to be in just about every prepared food one buys these days, high-fructose corn syrup (55% fructose, 45% glucose). These are the empty calories that Americans and inhabitants of all advanced industrial countries have been consuming in ever greater amounts for a few hundred years (Americans in 1999 were consuming an average of 158 pounds of sugar per year). And according to several studies that Taubes cites, it is this major dietary change that accounts for the damage wrought to our bodies. Humans have simply not had time to evolve fast enough for our bodies and organs (particularly the liver and pancreas) to handle the huge increase in refined sugars we now consume. This evolutionary argument is one of the best Taubes makes. For example, he notes that yearly per capita consumption of sugar “more than quadrupled in England in the eighteenth century, from four pounds to eighteen pounds, and then more than quadrupled again in the nineteenth,” while in the United States “sugar consumption increased sixteen-fold over that same century” (42). But the real argument comes in studies done more recently on indigenous populations like American Indians and Africans who have changed to Western-style living more recently and more rapidly. Among the Indians of Arizona, for example, the Pima, along with other Native Americans, have seen diabetes rates explode from almost nothing when eating their native diet to over 50% and more now, after changing to a Western diet laced with sugar and immense quantities of soda. Another major study, Western Diseases (1981) by Denis Burkitt and Hugh Trowell on indigenous populations in Africa, saw tooth decay, gout, obesity, diabetes, and hypertension skyrocket with Westernization (229). Another study of natives from Tokelau, a Pacific island territory administered by New Zealand, saw the same phenomenon: from a diet of coconut, fish, pork, chicken, and breadfruit (a high-fat diet, Taubes notes), the Tokelau people shifted to western ways in the 1970s. Some migrated to New Zealand itself, while some simply stayed on Tokelau, but both adopted western ways and foods. Diabetes shot up to engulf almost 20% of the women and 11% of the men, with corresponding increases in hypertension, heart disease, and obesity. To sum up, Taubes quotes one of his heroes, John Yudkin, a University of London nutritionist, who wrote in 1963: “We now eat in two weeks the amount of sugar our ancestors of two hundred years ago ate in a whole year” (154). 
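            As a back-of-the-envelope check (my own arithmetic, not Taubes’s or Yudkin’s, using only the figures quoted above), the two-weeks-versus-a-year claim is at least plausible:

\[
158~\tfrac{\text{lb}}{\text{yr}} \times \tfrac{14~\text{days}}{365~\text{days}} \approx 6~\text{lb of sugar consumed in two weeks,}
\]

which sits at the low end of the four-to-eighteen-pound annual range Taubes gives for eighteenth-century England.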
            But many of us have heard these stories and statistics before. What is surprising and even shocking in Taubes’s arguments are the supportive and historical facts. As one might guess from his argumentative title, not everyone agrees with Taubes. In fact, for the major arbiters of American eating policy like the FDA and the NIH (National Institutes of Health), sugar has remained a kind of untouchable, on the list of GRAS (generally recognized as safe) foods until virtually the present day (a quick Google search revealed that the latest FDA recommendations concerning sugar only warn that sugar can cause tooth decay!). This contrasts with FDA decisions on the sugar substitutes saccharin and cyclamates, which, with urging from scientists sponsored by the sugar industry, have been listed as possibly carcinogenic. But the major thrust of the scientists like Ancel Keys of the University of Minnesota and Fred Stare of Harvard (both funded handsomely by the sugar industry, Taubes points out) has been twofold: first, that it is the fats in our diets that kill us prematurely in the West, via heart disease and diabetes; and second, that “we get obese or overweight because we take in more calories than we expend or excrete” (107-9). Taubes calls this “the gift that keeps on giving,” because it essentially exonerates sugar from any role in obesity or diabetes or the host of other diseases plaguing Western societies. It doesn’t make any difference what you eat, goes this mantra, because “a calorie is a calorie.” Eat too much food (too many calories), and exercise too little, and you get fat, which causes you to get diabetes and die early. Period. End of discussion. Forget about the impact of refined foods (white flour, white sugar) on one’s metabolism. 
            Taubes goes into most of the studies and white papers that chart this, to him, massive fraud, but it’s not necessary to repeat that here. Suffice it to say that the sugar industry has used its massive profits and political clout to pretty much cloud the issue of sugar’s deadly effects in much the same way the tobacco industry clouded, for years, the dangers of tobacco smoke. Indeed, one of the really surprising roles of the sugar devil is in flavoring tobacco. Yes, that’s right. Tobacco growers and cigarette makers learned around the turn of the twentieth century that tobacco in cigars and pipes wasn’t really selling enough product (getting enough people hooked). So, in 1914, R.J. Reynolds introduced Camels, “the first brand of cigarettes made of multiple tobacco types (basically flue-cured Virginia, and Burley tobacco) blended together” (64). Sugar entered in two ways. First, flue-curing the Virginia tobacco turned a natural sugar content of about 3% to one of 22% sugar. That higher sugar content makes the tobacco more inhalable, because the smoke becomes acidic, not alkaline (alkaline smoke irritates the mucous membranes and stimulates coughing, which is why pipe smokers rarely inhale). Second, the nicotine-rich Burley tobacco was “sauced” with sugars from honey, molasses, licorice and so on. This was first done for chewing tobacco, but Reynolds put it in Camels, and this really did the trick. Why? Because by blending “sauced” Burley tobacco with already-sweetened flue-cured Virginia, Camels were able to deliver a sweet-tasting and -smelling cigarette that was easier to inhale, and thus “maximized the delivery of nicotine—and carcinogens—to the human lungs” (69). Most other tobacco companies and brands wasted no time following Camels to the point that “by 1929, U.S. tobacco growers were saucing Burley tobacco with 50 million pounds of sugar a year” (69). That this is not fiction is evidenced by Taubes’s quoting of a 2006 report from the Netherlands: “Consumer acceptance of cigarette mainstream smoke [what’s directly inhaled] is proportional to the sugar level of the tobacco” (70). In other words, sugar’s key role in making cigarettes palatable (inhalable) to both men and women worldwide contributed in a major way to the epidemic of lung cancers that are still with us. 
            Taubes has endless data on the role of sugar in diseases, but a couple of points stand out. First, he cites studies showing that sugar may well be addictive (as anyone with children knows). It elicits a response in the brain’s “reward center,” i.e., the nucleus accumbens, that closely resembles the response from nicotine, cocaine, heroin, and alcohol. This may help explain why its rise has been so spectacular in every society where it was introduced. More important, to me at least, is the connection Taubes tries to make between sugar and what is now generally conceded to be a key factor in diabetes and a host of other diseases that cluster with it—insulin resistance. Insulin, of course, is the hormone produced in the pancreas that acts to combat high blood sugar. Basically, when blood sugar (glucose) levels rise, the pancreas responds by secreting insulin, which “signals the muscle cells to take up and burn more glucose.” Insulin also induces cells to store some of the glucose as fat (insulin is said to be “lipogenic” or fat-forming). Then, when blood glucose falls, the insulin level falls too, and the stored fat can be burned instead of glucose. The problem is that some people (and increasing numbers in Western societies) exhibit a condition known as “insulin resistance”: their cells do not accept the insulin, and hence the glucose levels in the blood, and the levels of insulin, remain high or get higher. 
The question is: what causes insulin resistance in the first place? 
This would appear to be the 64-million-dollar question (the fight over this is quite active even today, with one side blaming the “drop in insulin sensitivity” on a fat, “intramyocellular lipid” said to block cell receptors from accepting insulin; and another blaming insulin resistance on excess carbohydrate [especially sugar] consumption and too much insulin production). Oddly, Taubes himself does not give a conclusive answer. He strongly suggests that insulin resistance is caused by too much sugar consumption, but he hedges. He says that in contrast to the sugar industry’s old mantra that insulin resistance and diabetes are caused by obesity, i.e. eating too much or consuming too many calories, another possibility 

is that these elevated levels of insulin and the insulin resistance itself were caused by the carbohydrate content of our diets, and perhaps sugar in particular. Insulin is secreted in response to rising blood sugar, and rising blood sugar is a response to a carbohydrate-rich meal. That somehow this system could be dysregulated such that too much insulin was being secreted and that this was causing excessive lipogenesis—fat formation—was a simple hypothesis to explain a simple observation. (120-1). 

But because insulin resistance is so important as a causal factor in diabetes, obesity, metabolic syndrome, heart disease, and so on, we would like something firmer. Taubes doesn’t give it to us, partly, he explains, because the NIH decided long ago not to fund long-term studies on the effects of sugar in the American diet because it would cost too much. It would also take a long time, since diabetes does not manifest overnight; like tobacco, sugar’s effects usually take years to manifest. But still, we want more. And Taubes does cite one study from Switzerland that seems to provide some scientific proof. Luc Tappy of the University of Lausanne studied fructose (fructose is metabolized without the need of insulin) in the mid-1980s. What his study found is crucial, even though it was short-term:
When Tappy fed his human subjects the equivalent of the fructose in 8 to 10 cans of Coke or Pepsi a day—a “pretty high dose” as he says—their livers would start to become insulin-resistant and their triglycerides would elevate in just a few days. With lower doses, the same effects would appear but only if the experiment ran for a month or more (205). 

This would seem to be a sound study, indicating insulin resistance and elevated triglycerides, both key markers for diabetes and for heart disease, from excess sugar. But again, its effects derived only from the use of fructose—though that is the main ingredient in most sugared drinks and lots more, via that demon invention, high-fructose corn syrup. 
            Unfortunately, this is as much as Taubes provides concerning insulin resistance. And that is a shame. 
            Still, Taubes’s book is well worth reading, if only for the disgraceful history it unfolds, a history of sugar that has scarred this nation from its very beginnings (slavery and sugar are intimately intertwined) and which still, in 2018, thanks to the power of the sugar industry in shielding its product from blame, continues its devilish work. 

Lawrence DiStasi

Tuesday, April 24, 2018

Opioid Crisis

I probably wouldn’t have read Dreamland if my daughter and son-in-law hadn’t given it to me when we had Easter dinner. After all, there has been nonstop coverage of the overdose deaths plaguing white America for several years, and I pretty much thought I knew it well enough—well enough to be disgusted that the attitude of official America had softened dramatically from the “get tough on drugs” approach that had locked up millions of urban African Americans to the “we need understanding and medical treatment” that had prevailed when the addict ghetto suddenly became white and suburban. But they did and I did, and now I think everyone else should—read the prizewinning 2015 book by Sam Quinones titled Dreamland, that is. It really is a gripping and infuriating tale of malfeasance, first by a medical community that adopted prescription opiates for pain with missionary-like blinders; then by the health care establishment that refused to pay for the full treatment for pain (psychologists, social workers etc.) that pain clinics recommended but only for pills; then by the pill-pushing pharmaceutical companies that cashed in on the bonanza without a care for the horror they were perpetrating but only for their profit; and finally by an American public whom success had schooled to expect quick technological fixes, especially for pain, any pain at all which, they thought, every American had a right to not ever suffer. All this, and more, is part of the true tale that Sam Quinones tells. It is essentially a tale with two major strands: in one, unscrupulous doctors operating “pill mills” write thousands of opioid prescriptions for people with nothing but a claim of pain (back pain, knee pain, any kind of pain), thus creating millions of addicts in suburban towns where none had existed before; and in the other, young men from a small area in western Mexico called Nayarit discover that the black tar heroin they can make from poppies nearby can be peddled in an entirely new, safe, and convenient way in small cities and suburbs in America.  
            Quinones begins his tale with the story from which his title comes, Dreamland. It was the name of a community pool in Portsmouth, an idyllic center of safety and pleasure for a suburban town in southern Ohio that used to be a main U.S. manufacturing center for shoes and shoelaces. Like other small Midwest factory towns, though, outsourcing to foreign countries with cheap labor killed the manufacturing, and killed Dreamland along with it. And it was in Portsmouth, Ohio, along with nearby towns in Kentucky and West Virginia, that the opiate epidemic found a dispirited and dead-end population eager to be seduced out of its existential pain. 
            To understand how this could happen, Quinones takes us through the growth of the American pain revolution. He starts with the discovery of heroin in Germany in 1898 (the drug was actually first synthesized in London in 1874 as diacetylmorphine, but the Bayer chemist named Dreser who duplicated it in Germany called it heroin—from heroisch, German for “heroic.”) Though heroin was first believed to be non-addictive, regulators soon learned better, and its use in the United States was strictly controlled starting in 1914. Still, there continued to be people, particularly in big cities, who got hooked, and one of the treatments for such addicts was a substitute drug called methadone—with methadone clinics maintaining addicts in cities all across the country. The problem with these clinics was that they tried to limit addicts to doses of methadone that were too small, so addicts had to find their dope fix elsewhere. Now shift to the Nayarit-based (the municipality was Xalisco) dealers of black tar heroin, a form that is much less purified than regular street heroin, but also far more powerful—delivering 80 percent heroin compared to the usual street version at 10 or 15 percent. As Quinones puts it, the Xalisco Boys “discovered that methadone clinics were, in effect, game preserves” which provided one of their first footholds in American cities. 
But then the pain revolution happened. It started in the 80s with well-meaning doctors seeking to provide mostly terminal cancer patients with pain relief. They called it ‘palliative care,’ essentially figuring that risking addiction ranked low on the ladder for someone who was dying soon anyway. This work was ‘helped’ when, in 1984, Purdue Frederick Pharmaceuticals came out with MS Contin, a timed-release morphine pill that would soon become OxyContin. Fitting perfectly with pain-management, the new pills “seemed far less addictive because through their timed-release formulas they eased relief out to the patient over many hours” (85); that is, addicts couldn’t get the big “rush” from a normal morphine hit. At about the same time, pain management organizations promoted the idea that “America was undertreating pain,” all kinds of pain, any kind of pain. State lawmakers followed suit, passing laws “exempting doctors from prosecution if they prescribed opiates for pain.” Things then grew like Topsy, with patients soon getting used to demanding drugs for pain (my own doctor in Pt Reyes, CA told me that to this day, even with all that is known about the opioid epidemic, his clinic sees patients literally demanding pain drugs from doctors); and, more important, doctors becoming convinced that these morphine-based painkillers were virtually non-addictive when used to treat pain. This latter idea stemmed from an alleged “report” in the New England Journal of Medicine by a doctor named Hershel Jick, which claimed that “less than one percent of patients treated with narcotics developed addictions to them.” Drug companies like Purdue producing timed-release opiates seized on this alleged “landmark report,” turned it into a “scientific study,” and promoted their pills as non-addictive, with most doctors quick to believe them. In truth, of course, the Porter & Jick “study” was merely a “one-paragraph letter to the editor” that commented specifically on the use of opiates with cancer patients under strict hospital supervision where the danger of over-use was almost non-existent. But no one bothered to look up the research (there was none), and Purdue Pharma aggressively promoted the use of OxyContin for all kinds of pain relief, at the same time ridiculing the fear of patient addiction as “basically unwarranted.” 
Now what you had in places like Portsmouth, OH were hordes of people looking for relief from their pain (from back pain to sports pain to post-operative pain to existential pain), and just as suddenly, scores of unscrupulous doctors setting themselves up in pain clinics to dispense prescriptions for OxyContin to anyone claiming to be in pain. Any kind of pain. All pain. The result was long lines outside these pain clinics, and beyond that, many now-unemployed people throughout Appalachia signing up for SSI (Supplemental Security Income), because with it they could get a Medicaid card that paid for their pain prescriptions. This Medicaid scam actually became a business for many; they would get their prescriptions filled for a $3 copay and sell the pills to those who had run out for up to $10,000 on the street. Some bright entrepreneurs actually took to driving carloads of pill seekers long distances to cheaper pill mills in Florida and elsewhere—anywhere that pills could be procured more cheaply—for a nice profit. By 2002, according to Quinones, OxyContin prescriptions for pain rose from 670,000 in 1997 to 6.2 million five years later. And of course, contrary to the claims by Purdue Pharma that the pills couldn’t be misused, addicts soon learned to crush the Oxy pill and snort it, or even liquefy it and inject it, thus “obtaining all twelve hours’ worth of oxycodone at once” (138). 
The stage was now set for the Xalisco boys to expand their range to those towns outside the major urban centers, especially in the middle of the country, that had a growing supply of opiate users. And they did, moving operators to Portland, Cincinnati, Las Vegas, Portsmouth, Charlotte, Nashville, and everywhere in between. Their appeal was, first, that their black tar heroin was so much cheaper than prescription pills (which could cost as much as $30 a pill). As Quinones puts it, 
by 2011, you could buy fifteen balloons of potent Xalisco black tar heroin for a hundred dollars— $6.50 a dose, about the price of a pack of Marlboros—in Charlotte, NC, a town that barely knew the drug a few years before (229).

The second appeal was the delivery system. The Xalisco boys had early discovered that it was easy, in the suburbs, to set up personal delivery to addicts. All an addict had to do was call a cell phone number; the driver would deliver it in minutes to an agreed-upon place like a mall parking lot. And since the Mexican drivers were trained not to carry either weapons or excess dope but only five or six balloons with pellets of heroin inside that they carried in their mouths, they were almost never caught carrying heroin. If police came, they would simply swallow the balloons. Even when caught, their punishment was usually simple deportation, and another driver would soon replace them. The third appeal, of course, was the potency of their product. As noted above, black tar heroin was up to 80% pure. This was a boon at first, but then it had its negative effect: overdoses by addicts who were used to a far-less-potent hit. 
            It was the overdose deaths that began to alarm officials throughout the nation. Two government workers in Olympia, WA, Jaymie Mai and Dr. Gary Franklin, began to investigate the overdose deaths in their state and found they were increasing at alarming rates, and in tandem with the amount of opiates being prescribed, or rather “overprescribed.” When they published their results recommending strict prescribing limits for opiates, though, they were not greeted as heroes; rather, Mai and Franklin were sued by a doctor named Merle Janes, who alleged that their opiate guidelines exhibited “an extreme anti-opioid discriminatory animus or zealotry known as Opiophobia” (234) that would corrupt the management of public health policy. Fortunately, Dr. Janes’s lawsuit was thrown out of court in 2011, and Washington state issued its guidelines, soon followed by several other states. But it had taken a long time, perhaps too long. In 2005, for example, an epidemiologist in the Ohio Department of Health noticed that not only were poisoning deaths in Ohio increasing alarmingly, but that most of them were drug overdoses. By 2007, further analysis revealed that drug overdose deaths in Ohio were about to, and did in 2008, “surpass fatal auto crashes as Ohio’s top cause of injury death” (249). This, again, corresponded with the amount of oxycodone dispensed, which rose by almost 1,000 percent in Ohio between 1999 and 2008. Put another way, “the number of Ohioans who died from overdoses between 2003 and 2008 was 50% higher than the number of U.S. soldiers who died in the entire Iraq War” (250). 
            There is much more detail in this astonishing and infuriating book, but to bring it up to date, here are some facts about the state of the opioid crisis in the United States as of 2018. The U.S. is still, by far, “the leading prescriber of opioids in the world” (this and all subsequent quotes from Vox.com, 3.6.18, Ella Nielsen, “America’s opioid crisis has become an ‘epidemic of epidemics’”). Driven by the drugs named above, plus fentanyl, the powerful synthetic opioid, “drug overdoses claimed 64,000 lives in 2016 alone, more than the entire death toll during the Vietnam War.” And finally, because of intravenous drug use, “there are 40,000 to 50,000 new cases of bacterial endocarditis (infection from re-using unclean needles) in the U.S. each year” which cost “more than $120,000” apiece to treat. When added to the new cases of Hepatitis C from the same source, this amounts to what researchers now call a “syndemic,” or “multiple diseases feeding off one another.” The two Washington State researchers, Jaymie Mai and Gary Franklin, put this another way, calling the opioid crisis “the worst man-made epidemic in history, made by organized medicine” (Quinones, 310). 
            One has to agree with this, and with Quinones, when he underscores the view of Martin Adler (professor of pharmacology at Temple University) that morphine is “a great metaphor for life.” Here’s how Adler puts it, referring to the perverse idea, which prevailed for a time, that pain was somehow to be exiled from human experience:

“The bad effects of morphine act to minimize the use of the drug, which is a good thing. There are people born without pain receptors. [Living without pain] is a horrible thing. They die young because pain is the greatest signaling mechanism we have” (313). 

To attempt to take away that signaling mechanism, as American medicine tried to do, was more than a terrible blunder; it was a fundamental misunderstanding of life and the human organism— how it works and how it is designed to work. 

Lawrence DiStasi


Tuesday, April 17, 2018

Dear Facebook

Like everyone else, I followed the highlighted testimony of Mark “the zombie-shark” Zuckerberg before Congress. And what I have decided is that there is simply no way these creeps are going to desist from their surveillance of me and billions of others—because it’s their business. Facebook followers like myself sign up so we can see the pictures and doings of our relatives, mostly the kids, and in doing so unintentionally allow the Facebook creeps to harvest all our information and preferences so they can sell it to advertisers who want to target their ads to us. In short, the business isn’t about connecting people to each other; it’s not a “service” as Zuck likes to say; it’s about spying on everyone everywhere to get their information and preferences, so the data can be sold for billions to the greedy corporate bastards who want to precisely target customers (and apparently to Russkies who want to target potential voters). If they know I’m interested in cars, they can send me ads for new cars. If they know I’m interested in underwear, they can send me ads for underwear. I’ve had this happen recently: I looked on a couple of websites for packages of underwear to redeem a gift card, and haven’t stopped getting underwear ads on every site I go to. At one point recently, I was pissed off at a rise in my car insurance rates, and so checked out a site that promised cheap car insurance for low mileage drivers. I’ve since been besieged daily with teasers for “Ford owner in Bolinas” trying to lure me to check out a great insurance rate (which turns out the same each time: they feed me to some company who asks for more information which feeds me to someone else for the same routine, but never, ever do I get an actual rate quote). 
            So here’s what I’ve decided. Ok. You’ve got my browsing records. You’ve got my “likes” and the articles I’ve seen posted by others on my Facebook feed that I’ve re-posted. You’ve seen my blogs that I’ve posted, and articles I think are worth posting, so you know what I like, or think you do. So to save you time (with the saved time, you’ll have ample time to shove my preferences up your blowhole), I’ve decided to put it all out there and let you pursue me with your best shot.
            --I need underwear every once in a while. I like boxer briefs.
            --I need socks periodically, though my sister bought me enough one recent Christmas to last me for years. 
            --I need gasoline for the little driving I do, but you can’t sell that over the internet.
            --I read political articles on some sites you may not like much: Truthout, Common Dreams, Consortium News, Counterpunch, Reader Supported News, Nation of Change, Daily Kos, and others of that lefty ilk. I also read anything I can get my hands on that shows why this nation is going into the toilet, and how we might hasten its demise, especially that of its current economic system, to bring about a more equitable distribution of wealth, health and happiness. That includes revolutionary change of the kind that doesn’t involve buying lots of useless shit from big box stores, or going out to stupid restaurants that prepare tasteless, nutritionally disastrous meals; quite the contrary: it involves, precisely, refraining from the great American pastime of buying dumb shit or wasting time on websites that allow us to observe others posting about the dumb shit they’ve just bought or eaten, and instead procuring only what one needs, when one needs it. Taking pride, in fact, in not buying useless products that only help destroy our planet.
            --I have decided that keeping up with the news from my relatives may make it worthwhile to maintain my Facebook account, at least for a time. But know this: you can spy all you want and target me with dumb commercials all you like; I’m not about to buy a damn thing. Not a damn thing.
            So spy away, Zuckerberg and company. These eyes refuse to buy or even notice what you’re selling, nor do I have even the remotest intention of getting a smartphone or any apps that make it easier to tune in and buy said dumb shit. All I might buy are books, and I don’t like reading books on screen in any case; I like the old-fashioned paper kind you can cozy up with, which is also what I like to write. 
            In closing, let me summarize what I really mean to say by saying to Facebook what my Great Aunt Zi’Carmela would’ve said: a’ fottuta.

Lawrence DiStasi

Friday, March 30, 2018

Good Friday

            I woke this morning rather early, for my second pee, with dawn’s light just beginning to seep in from the east, and, glancing through my bathroom window to the west, I saw something I don’t remember seeing before: a full moon, already about a third obscured, setting in the west. It seemed uncanny to me. I had never seen a moon-set before, at least not that I remember. I’ve seen a quarter or half-moon in the western sky high up, while the sun rose in the east, but never this—a full moon looking almost like the sun setting, only it was this bright, glowing moon. Later, I saw an item in my Bing news feed: that this was a special moon in another sense; it was a “blue moon,” the second full moon in a single calendar month. And, a bit later, that it was also a “paschal” moon—the moon around which Passover, and therefore the Easter that always follows the Jewish holiday (since Jesus was a Jew and celebrated his Last Supper in honor of Passover), also revolves.
            Of course, it’s not Easter yet. It’s Good Friday. And that name also gives me pause. I mean in what sense is the gruesome crucifixion that Good Friday celebrates “good?” Why do Christians celebrate (as many adherents of other religions often ask) the brutal torture and killing of their God? Why have they made it the central symbol of their religion? Why do religious people wear crosses or even crucifixes (metal images of their dead God hung from the cross) around their necks? 
            I haven’t thought about these things for many years, mainly because I have been practicing Buddhism since my thirties and Christianity has seemed mostly irrelevant. But of course, like many others, I still mark Christian holidays by attending special family dinners, featuring traditional meals that I love. And periodically I write a blog about one of the holidays and the philosophy behind it. As I’m doing here. What is this Good Friday thing? Does it depend on knowing that the dead God, Jesus, will rise again on Easter Sunday (implying that we, too, will survive our death)? Is that why the Friday of his death is “good?” Or is the Friday of the God-slaying “good” even aside from the joy of the resurrection? 
            If I remember correctly, there is something “good” about the death of Jesus, about his sacrifice, because he is thought to be redeeming all Christians (and perhaps all of humanity) thereby. His death is an atonement, a sacrifice to his Father, the big Macher, for the “sins” of humanity. And because he is believed to be no less than God himself, this sacrifice, this death of God himself, is sufficient to make up for, to atone for the horrible sin(s) of humans. And that, of course, raises the key question: what have we humans done that we require such an awful atonement? 
            If we follow orthodoxy and the Bible, we would have to say that humans, in the persons of Adam and Eve, sinned against God by eating of the forbidden fruit. As Milton put it in Paradise Lost: “Of Mans First Disobedience, and the Fruit/ Of that Forbidden Tree, whose mortal taste/ Brought Death into the World, and all our woe…” So there it is: this flouting of God’s command in Paradise set the stage for the coming of Jesus to perform his sacrifice and redeem us all from what Roman Catholics call “original sin.” But clearly, we’re not just talking about an apple here (or even a pomegranate). The Bible is referring to something more critical, more outrageous. It is referring to some deep transgression, a crime so unforgivable that only the death of God’s son could atone for it. What could that be?
            Given the symbolism, we might reason, it had to do with eating—since eating the fruit is what causes the problem in the first place. And this reminds me of the first novel I ever wrote, still unpublished, called Eat. It concerns a character who decides that human life disgusts him so much that he will stop eating. Stop eating altogether. Not a fast for some political reason, but simply as a sign of his disgust with the human situation. It was based on some fasting done by a student of mine when I was teaching at a small college in Gettysburg, Pennsylvania, and I found it so fascinating, especially for an Italian, that I turned it into a novel. The point here is that the act of eating itself, no matter how we try to sanitize it, brings with it all kinds of ethical problems. If we eat meat, we are essentially eating the dead bodies of animals. And that requires that we take life to sustain life, that we kill something living to survive. And even if we limit our consumption to vegetable products, we are still taking vegetal life from plants that would prefer to proliferate on their own. There is simply no getting away from it: Life lives on other life. Living things survive by killing and consuming other living things. And so, if one wanted to live a perfectly blameless life, one would have to find some way to survive on air, or light, like the angels. This is the dream behind religion, not to mention the veganism so popular in our time: there must be a way to survive without killing other beings.
            If we were to expand this further, we could easily see that our current predicament, in this twenty-first century of the Christian era, stems from related problems. Humans have turned out to be the most lethal creatures ever to roam this earth. We have, anthropologists tell us, been directly responsible for the extinction of countless species—for example, several species of large mammals that once roamed the North American continent. Currently, our food production practices (what we do to raise cattle, pigs, fowl, and even fish on farms) are causing endless mayhem to the natural world. To provide grain for these animals, we have increasingly destroyed the forests that serve as the lungs of our planet. The excrement from our farm animals is a major source of pollution, including methane, one of the most potent of greenhouse gases. And our industrial fishing of the oceans has depleted fish stocks to nearly the point of no return. The industry and population explosion made possible by this food system has led to global warming and what is turning out to be the greatest mass extinction of other species in history.
            So, yes, our eating practices, our virtually infinite appetite for flesh to sustain us, could well be said to constitute our original sin. The flaw in the very biological construct that we are. But isn’t it the case that we are simply following the pattern laid down by evolution itself? Aren’t we simply doing what all living creatures do, and must do—consume each other? We are. And here we come to the real flaw. The root flaw, as I reflect on it, is our inability, our refusal to accept who and what we are. Our determination to make of ourselves, the species homo sapiens, something special. Something out of the ordinary. Something separate and unequal. Something that can somehow, no matter the cost, circumvent the necessity of our being. And so we invent gods, and sons of gods, who come into existence to save us from ourselves. Save us from being human. Jesus on the cross redeems us—and thereby makes it ok to do what is repulsive—consume the flesh of other beings. Then and thereby, we become humans who are above and beyond all other creatures. We are the special creature, the one that god saved by sending us his only son to die for us. And we can comfortably go on despoiling the earth and its creatures because we have been given that dispensation. We have been given Good Friday. We have been given redemption. We have, essentially, been given the earth to do with, to despoil as we will. 
            I think it is clear what the problem is here. What the “good” in Good Friday means to signify. And whatever we do, somehow, to move beyond our root refusal to accept our aliveness must involve our humanness and the responsibility that comes with it. The responsibility, I mean, to find a way to embrace our kinship, our continuity with all of creation, that can keep us from consciously or unconsciously destroying it. An ability to somehow consume what we must consume with some sort of reverence. Some sort of restraint. Some sort of appreciation. Some species of willingness to allow all else to do the same in the precious and precarious balance all have been given.
Even when it means, as it inevitably must, that we ourselves are the consumed. 

Lawrence DiStasi

Saturday, March 10, 2018

Kim, the Trumpster and Denuclearization

            World news outlets were rocked this week by Donald Trump’s alleged agreement (the news was actually broken, rather strangely, by the South Korean delegation that brought the North Koreans’ offer to the White House) to hold a meeting with North Korea’s leader, Kim Jong Un. Given Trump’s belligerent rhetoric of only a few months ago in which he threatened to rain “fire and fury” upon the North Koreans, it was surprising to say the least. But perhaps it was really not so surprising after all. For Trump clearly sees in this nuclear summit the prospect of great headlines throughout the world—and if there is one thing this global narcissist lives for, it’s headlines announcing a “first” for him (not to mention getting other less savory news off the front pages). Whether he’ll be able to pull something real out of his toupee is something else again. For as usual, the headlines mistake what Kim Jong Un has offered. He did NOT say he was willing to give up his nukes. He said he is willing to discuss the “denuclearization of the peninsula.” What does that mean? Not exactly clear. But one thing is certain: Kim Jong Un, like his father and grandfather before him, wants to get out from under the American threat, both nuclear and otherwise. He wants to get a formal end to the Korean war (not just a cease-fire). He wants a halt to the joint military exercises put on twice a year by American and South Korean forces, including simulated nuclear strikes. He wants an end to punishing sanctions. So when he says “denuclearization,” if he means it at all, he’s talking more about getting rid of American nukes (including those at the disposal of South Korea) than getting rid of his own. And he may just figure that if he can get Trump committed to a huge summit meeting, the president may be so rash as to agree to something that would be a huge victory for North Korea. Already, in fact, Kim has gone a long way toward getting what he has long sought: using his nuclear testing to induce the United States to recognize him and his country as equal enough to deserve a summit meeting.
            But enough speculation. Here I would like to simply provide the background I adduced way back in September when the explosions from both sides were giving the world daily agita. This is important to keep in mind when, as no doubt will happen, our media outlets drum up the heart-stopping “US v THEM” dramatics should a summit actually take place. Hence, this blog, first posted on September 5, 2017. It’s titled “Who’s ‘Begging for War’?”

As North Korea ups the ante once again, this time with a massive nuclear blast that some observers (and the North Koreans themselves) are calling a hydrogen bomb, the rhetoric coming out of the Trump administration gets more belligerent by the minute. As I noted in a previous blog, the two adolescent leaders—Kim Jong Un of North Korea and Donald Trump of the U.S.—are engaged in a pissing contest. ‘My dick is bigger than yours; look how far my piss goes.’ Only it’s not piss that’s being compared; it’s weapons of such massive destructive power that most rational humans shudder to even contemplate their use. But not the Donald. “What’s a nuke for, if we can’t use it?” he once said. Most recently, Nikki Haley (who, before becoming our UN Ambassador, seemed semi-rational) has been uttering nutter phrases like “We have kicked the can down the road enough. There is no more road left…” No more road for diplomacy, is what she seems to mean, especially considering that she also said Kim Jong Un is “begging for war.” The President himself tweeted much the same thing, berating the South Koreans for their “talk of appeasement with North Korea,” which will “not work, they only understand one thing.” That is, violence of the nuclear variety.
            And as we all look on in horror as nuclear Armageddon looms ever closer, we have to ask: Just who is it that’s begging for war? Can the world really believe that North Korea, a nation of 24 million people whose economy seems permanently hobbled, and whose military, while large, would be no match for that of the United States and South Korea combined (the South itself may have nukes in its huge military arsenal supplied by the United States), actually wants a war? Or is it rather Donald Trump—he whose administration has lurched from one failure to another without a single legislative victory, with an approval rating that’s the lowest of any president in modern history—who is really searching for a ‘wag-the-dog’ solution to distract us all from his mounting problems?
            To really probe this question, especially the one concerning what exactly Kim Jong Un thinks he’s doing with his rockets and nukes, we need to know a bit about history (which most Americans, especially their idiot president, do not). My source is an article that appeared on consortiumnews.com last week: “How History Explains the Korean Crisis,” by distinguished historian William R. Polk. In it, Polk makes sense of North Korean belligerence by detailing the long history of invasions Koreans have suffered, starting in 1592, when Japan invaded and controlled the country for a decade or so. The Japanese invaded again in 1894, and this time set up a ‘friendly government,’ thereby ruling Korea for the next thirty-five years. It was in this period that many Koreans fled the country, including Syngman Rhee (the first president of the South), who fled to America, and Kim Il Sung (the first leader of the North), who fled to Russia-influenced Manchuria, where he joined the Communist party. By WWII Japan had reduced many Koreans to virtual slaves (thousands of Korean women became “comfort women” or concubines for the Japanese Army). But what’s fascinating to me is what happened to some of those Koreans who became rulers in the post-WWII period. Syngman Rhee, long resident in the United States and ‘Americanized’ (not to say ‘Christianized’), was set up as the first president of the new South Korea (North and South were vaguely established by the UN in 1945, but Rhee officially became the South’s ‘president’ with American help in August of 1948). He ruled basically as a U.S. puppet, with the United States military assuring his continuance by sending thousands of U.S. troops to support him, and American industry assuring the economic rise of his part of the country. According to William Polk, “Syngman Rhee’s government imposed martial law, altered the constitution, rigged elections, opened fire on demonstrators and even executed leaders of the opposing party.” His successor (via a military coup in 1961), Park Chung Hee, spent the war years in Korea, but as a collaborator with the Japanese occupiers (he apparently even changed his name to a Japanese one). His rule as President was so vicious that he, too, was overthrown (assassinated by his intelligence chief, 1979) and replaced first by Choi Kyu-hah, who was then deposed in a military coup by Gen. Chun Doo-hwan, who himself immediately imposed martial law that closed universities, banned political activities and throttled the press. A book I’ve read recently, Human Acts by renowned Korean novelist Han Kang, dramatizes the university and high school protests of 1980 in Gwangju that were savagely put down by Chun Doo-hwan, who ordered soldiers to coldly shoot over 600 young protesters and bury them in mass graves.
            By contrast, Kim Il Sung, who was to become the leader of North Korea after WWII for no less than fifty years, spent the war years as a guerrilla fighter in Russia-influenced Manchuria, where he led the resistance to the Japanese occupiers. His status as a hero was established there, and he soon became the first Prime Minister of the North, which he declared a state in 1948, with ambitions to reunite North and South (Syngman Rhee had announced the same intention, as ‘reunification’). But when Rhee declared that the South was a fully independent state, Kim Il Sung saw it as an act of war, and (once China had agreed to take responsibility for the outcome) ordered his army to invade the South. Far better equipped and motivated than the southerners, Kim’s army took possession of Seoul, the South’s capital city, within three days, on June 28, 1950. By this time, the U.S. had persuaded the UN Security Council to protect the South, and organized 21 countries to send troops (though Americans made up the bulk of the forces in what was called a “police action”). Still, Kim’s military drove the southern army all the way south to the city of Pusan, where, by August, the southern army “held only a tenth of what had been the Republic of Korea.”
Here the situation was saved for the South only by the brilliant counterattack led by General Douglas MacArthur, who made a storied landing at Inchon, where, behind enemy lines, the Americans were able to cut off the Northern army from its bases. That led to a further attack by the South, which retook Seoul, and then moved across the 38th parallel (the dividing line between North and South) and drove nearly to the Chinese frontier. This brought China into the conflict; with what it called a 300,000-man “Volunteer Army,” the Chinese overwhelmed the South Koreans and drove the Americans out of the North. At this point, General MacArthur urged President Truman to use fifty nuclear weapons to stop the Chinese, but Truman instead replaced MacArthur and continued the more or less conventional war. Except that it was not at all conventional for the North. U.S. carpet-bombing devastated the North with more tonnage (including chemical weapons) than had been used against the Japanese in all of WWII. Analysts today estimate that the North lost 33% of its population through this bombing—one of every three North Koreans perished. As Polk puts it, “Korea proportionally suffered roughly 30 times as many people killed in 37 months of American carpet-bombing as these other countries (Britain, France, China and the U.S.) lost in all the years of the Second World War.” This may help explain why North Koreans generally favor their government’s stance to repel invaders at all costs: most have experienced the utter devastation of war firsthand.
            Finally, the North agreed to negotiate a cease-fire to end the stalemate (the state of war between North and South still exists), with the country divided at the 38th parallel by a demilitarized zone to keep the armies separate and to keep ‘new’ (i.e., nuclear) weapons out of the peninsula. Unfortunately, the United States, in 1957, violated article 13(d) of the agreement:

In June 1957, the U.S. informed the North Koreans that it would no longer abide by Paragraph 13(d) of the armistice agreement that forbade the introduction of new weapons. A few months later, in January 1958, it set up nuclear-tipped missiles capable of reaching Moscow and Peking. The U.S. kept them there until 1991. It wanted to reintroduce them in 2013 but the then South Korean Prime Minister Chung Hong-won refused (Polk op. cit.).

Thus, we see that it was the United States that decided to introduce nuclear weapons to the Korean conflict. But what about the North Koreans and their nukes? Even here, Polk points out, both the South and the North had agreed to abide by the Nuclear Non-Proliferation agreement (1975 and 1985 respectively), but both violated the agreement (South Korea covertly from 1982 to 2000; North Korea in 1993, withdrawing totally in 2003). Polk also adds that the precipitating event for the North’s withdrawal, and for the underground testing it began in 2006, was George W. Bush’s January 2002 “Axis of Evil” speech, in which he demonized North Korea:

Thereafter, North Korea withdrew from the 1992 agreement with the South to ban nuclear weapons and announced that it had enough weapons-grade plutonium to make about 5 or 6 nuclear weapons (Polk op. cit.).

            This brings us to today. As noted in a recent article (Mel Gurtov, “Echoes of Reagan: Another Nuclear Buildup,” commondreams.org: 9.3.2017), the United States currently has about 6,800 nuclear weapons (roughly 1,400 strategic weapons deployed, the rest stockpiled or retired). Among these, the 920 nuclear warheads deployed on some 230 missiles aboard virtually invulnerable submarines are alone “enough to destroy an entire country and bring on nuclear winter.” By comparison, North Korea may have about a dozen nuclear weapons (some analysts say they could have as many as 60), most of them about the size of the ‘paltry’ nukes that devastated Hiroshima and Nagasaki. It also has the still fairly rudimentary missiles it has been launching with frequency this year, very few with the ability to strike the United States or anywhere near it. So what is this business about the North “begging for war?” It is an absurdity. What the North is really after is simple: survival; a resolution of the South-North war, which has been ongoing since the armistice in July 1953; and, related to that, an end to the huge and provocative war games that have been carried on for the last three weeks. These ‘games’ go on twice a year, and are clearly designed to threaten the North by simulating an invasion of North Korea and a “decapitation” operation to remove Kim Jong Un. What would the United States do if Russia were to carry on war games from Cuba or Mexico? We already know the answer to that. Yet despite the continuing pleas of the North to the U.S. and South Korea to cease these provocative military exercises, the U.S. and its protégé have persisted and even expanded them ever since the end of active fighting. In addition to these regular war games, recently the United States has sent groups of F-35B fighters, F-15 fighters and B-1B bombers on military operations over a training range near Seoul, where they dropped their dummy bombs to simulate a nuclear strike. According to Mike Whitney (counterpunch.org, 9.4.2017, “What the Media isn’t telling you about North Korea’s Missile Tests”),
The show of force was intended to send a message to Pyongyang that Washington is unhappy with the North’s ballistic missile testing project and is prepared to use nuclear weapons against the North if it fails to heed Washington’s diktats.

That’s it exactly. For this is the way the world according to American empire works: we can hold threatening war games, we can surround you with nukes from submarines and bombers and missile launchers, we can insult you and threaten you and starve you and humiliate you and refuse to end our war against you, but if you dare to stand up to our bullying, we will destroy you. And it’s your own fault for defying our ‘rule of law.’
            But of course, the North sees through this. Kim Jong Un may be a clown with a funny haircut who’s trying to prove he’s a big boy now, but he’s no dummy. His nuclear response to the threats from the United States, considering his people’s history and his knowledge of recent history, is perfectly rational. As Mike Whitney points out, “Kim has no choice but to stand firm. If he shows any sign of weakness, he knows he’s going to end up like Saddam and Gaddafi.” To remind you, Muammar Gaddafi of Libya finally decided to take the West at its word, and give up his nuclear plans. He was thereafter the victim of a military intervention by western powers and ended up publicly violated in a gruesome death, mocked by American leaders like Hillary Clinton: “We came, we saw, he died.” Ditto Saddam Hussein of Iraq, whose country is in ruins. Kim Jong Un would clearly like to avoid that fate. He and his people would like to avoid being bombed back into the Stone Age, again. And so they are gambling that the blustering primate in Washington will either run true to his cowardly form, or be persuaded by calmer and more rational minds to see if there might not be an opening for negotiations. In fact, we all have to hope that this is the case. China and Russia also hope that this is the case, proposing once again (as they did in March and often before that) that in exchange for a halt to the military exercises by American and South Korean forces, North Korea could be persuaded to freeze its nuclear and missile programs. Surely, there is the germ of a diplomatic agreement here. Even South Korea’s new president, Moon Jae-in, has just reiterated his offer to hold peace talks with North Korea (newsweek.org, 9.5.17) in what he has called his “Sunshine Policy.”
The only real question is whether the United States, and especially its wacky president, will ever agree to stop the war games. Because, after all, we are the Americans, the big dogs, who don’t back down, don’t negotiate unless it’s totally on our terms. Which, in this case, means: do what we say, not what we do, give up your nukes, and we can discuss the terms of your unconditional surrender. Anything short of that is “begging for war.”

            Addendum, Mar. 10, 2018. Since writing the above, I have read Min Jin Lee’s novel, Pachinko. In it, Lee narrates a saga of a Korean family that, during the pre-WWII period, moves to Osaka, Japan, to seek its fortunes there. What we learn is the agonizing plight of Koreans in Japan (and in Korea under Japanese colonization) who are always and everywhere discriminated against as “lazy” and “morally defective,” even when successful and allowed to become Japanese citizens. This helps to explain both the never-healed antagonism between Koreans and Japanese, and the never-outgrown defensiveness of Koreans that persists to this day—including, perhaps, some of the behavior of Kim Jong Un himself.

Lawrence DiStasi