Thursday, December 29, 2011

Monsanto's Killing Fields

I have recently watched an interview with Dr. Don Huber, Emeritus Professor of Plant Pathology at Purdue specializing in microbiology, that will curl your hair (see the whole 57-minute interview at http://capwiz.com/grassrootsnetroots/issue/alert/?alertid=58601501, and/or another shorter interview with Huber at http://vimeo.com/22997532). It deals with Monsanto’s herbicide, Roundup (main ingredient glyphosate), and its growing panoply of Roundup Ready seeds, which have been genetically engineered to resist the killing effects of glyphosate, thus allowing farmers to spray Roundup liberally, killing all other plants and allowing the Roundup Ready ones to ‘thrive.’ Roundup Ready seeds now in use include soy (87% of the worldwide crop), corn, canola, cotton, and the recently-authorized-by-USDA alfalfa and sugar beets (despite Huber’s urging to Ag Secretary Vilsack to delay the approval of Roundup Ready alfalfa). Corporate agriculture considers this a miracle of American science and a boon to farmers and profits and even our health (with Roundup Ready crops, we are told, fewer pesticides have to be applied; Roundup alone does the job).

Dr. Huber, however, informs us in his dry, unemotional style that this is not merely a mirage, it is a con, a disaster, a crime against nature itself (my words, not his). The reasons are legion. To begin with, glyphosate, the key ingredient in Roundup, is, like many other pesticides, a “chelator.” That means it binds or creates a barrier around essential mineral micronutrients which are critical to the very heart of life and growth, especially enzyme function. Most critically, it is not just plants that require functioning enzymes; all organisms and microorganisms need them. So, to cripple the efficiency of a plant’s mineral uptake is essentially to kill it, and to counteract this killing, Monsanto has genetically engineered seeds whose plants have an alternative pathway for the uptake of some of these essential nutrients. They can thereby ‘survive’ the poison of Roundup. But the key point is this: the plant that gets sprayed with Roundup, even the GMO plant, still gets dosed with large quantities of the Roundup sprayed upon it. So does the soil, with all its microorganisms. Thus you get crops that have glyphosate on and in them (the glyphosate goes to key parts, like the seeds), and soil whose microorganisms are damaged the same way—microorganisms, one of whose main functions is to fight diseases. According to Huber, there are already 40 newly-thriving pathogens on many of our crops—diseases that used to be managed. No more. What’s worse, since these GMO crops—especially corn, soy and alfalfa—are the main feed we give to our stock animals, they too are being affected. Botulism has recently been seen in the intestinal tract of cows because glyphosate in the feed is killing or disabling the normal organisms in the cow’s gut that used to fight it.

Now here is where it gets scary. As noted above, all life employs the same basic mechanisms. So if glyphosate impairs the gut ecology in animals, we can expect that the effect in human stomachs will be similar if not exactly the same. Studies have already been done showing that in virtually 100% of cases, stock animals are showing a deficiency in manganese (needed for its antioxidant properties, its role in protecting plants from disease, and its enzyme-activating role in digestion) due to the chelating effect of Roundup Ready feed. There are also studies showing high levels of glyphosate in animal manure (so how can we use this animal manure on crops?) due to the feed they’re eating (all those Roundup Ready crops, now to include alfalfa), plus some new organism from hell (the electron microscope image of this thing is terrifying) that is suspected of causing reproductive failure in farm animals. First noted by vets in 1998, the fertility failure rate in dairy cattle has reached such proportions—45 to 70 percent—that dairies now worry about maintaining their stocks (note that Roundup Ready soy and corn were first used as feed in 1998). And of course, the real killer in all this: though hordes of researchers are trying to identify it, no one yet knows what it is. It seems to be about the same size as a virus, it can be cultured, it is self-replicating, there’s lots of it in GMO corn and soy, but scientists don’t know what it is.

Thus publication about its causes is lagging, but not just for reasons of uncertainty, and here the lugubrious Dr. Huber got as animated as he allows himself to get. Monsanto controls the science in this area. Anyone who does research that is not favorable to its products is either silenced, fired, or prohibited access to all products under patent. Why? Because Monsanto makes such research illegal. And the EPA and the FDA and the USDA all go along with it, because the lobby for big agribusiness controls the Congress and all government agencies dealing with such products.

So here’s what we’ve got, folks. We’ve got the most widely-used herbicide in the world killing not only plants (including impairing their ability to fix nitrogen from the air) but essential soil microorganisms and beneficial intestinal organisms in animals. We’ve got GMO crops that allow that product to be used in higher concentrations than ever because we’re told the magic of GMO somehow keeps the stuff off the GMO crop in question, when it doesn’t. And we’ve got symptoms now of some new monster organism that seems to be spreading abortions, infertility and premature aging among the farm animals (not only cattle but chickens, pigs and horses too) on whom we depend. And we haven’t even talked about glyphosate’s proven record as an endocrine disruptor—Huber mentions, in this regard, the notably lowered sperm count of human males, less than half of what it was only 20 years ago. And above all, we have a totally compromised government and its agencies supposedly protecting us while busying themselves blackballing and outlawing the science and the scientists who have been trying to sound the alarm about all this.

I don’t know about you, but I’m about ready to call for all-out war on Monsanto, on the USDA and Ag Secretary Tom Vilsack, and everyone else involved in the monstrous system we have allowed to thrive. I’m about ready to sign on with Lierre Keith, who recently called for serious radical action—whatever force it takes—to bring the entire sick system down. I mean, what else is there to do? Appeal to their better nature? It is to laugh, because these people—the CEOs, the so-called scientists Monsanto employs, the toadies in Congress who protect them to keep their state revenues jangling—are willing to poison every living thing on earth in order to maintain their stranglehold on the markets they have cornered. They care not a whit for life—plant life, microbial life, animal life, human life. They care only about killing. Why should we care about them or their sick, profit-driven lives? Why should we not begin a movement that a recent Newsweek column (Newsweek! imagine) predicted would include the following:

There will be prosecutions and show trials. There will be violence, mark my words. Houses burnt, property defaced. (Michael Thomas, Newsweek, Dec. 28)


And I believe there will. And if it targets any of the evil bastards who work, in any way shape or form, for the devil-spawned corporate monstrosity called Monsanto, I for one will cheer and salute and encourage it until the beast is choked on its own deadly brew.

Lawrence DiStasi

Friday, December 16, 2011

Incarnation

As I’ve come to expect in this “Season of Joy,” my mood has been growing more gloomy as the season progresses. Too many “Christmas Specials” with too many expected songs; too many commercials urging us to ‘hurry: only a limited and steadily decreasing number of shopping days left’; too much of the sense that increasingly each year the remembered spirit of this once-holy season becomes more and more degraded by the over-hyped orgy of conspicuous consumption it has become.

Then this morning, a possible turn. Though I have long since abandoned the theology the season supposedly represents—the virgin birth of a God called Jesus in a manger marked by a star—the underlying mystery is both profound and worthy of contemplation. I mean the idea of incarnation. Christian (in my case, Catholic) teaching makes a good deal of this: God comes to earth to save us (that’s the big takeaway) by incarnating: he deigns to become flesh, he takes human shape, as one of us. That’s what the joy is supposed to be about: God himself, or rather, his only begotten son, has come to be us all, to save us all. The problem is that this is hyped as something fantastic, something special, something that has happened only once in history, with the corollary that we, the chosen ones, are the only ones who know this and can thereby benefit from it. That’s where the bullshit creeps in. Because incarnation really is a big deal, only not in the manner of something special, something unique to us fortunate humans of the Christian persuasion, who alone will ride to heaven on its back. No. It’s a big deal because it is the great mystery at the center of all our lives, of all life, of all being. Incarnation. Something becomes flesh. Something that is presumably without substance, i.e. nothing, becomes something. And that is a big deal.

Now humans have long noticed this, have long made it a central mystery. A plant appears out of the ground in the spring. Miracle. Mystery. Repeated millions of times. Millions of fishes sprout from the sea: mystery; gazillions of bugs appear in flight from nowhere, as do thousands upon thousands of birds and gophers and all the beasts of the field. Miraculous, and beneficial to us, mostly, the humans who must depend upon crops and flocks and fishes. And so arise the mystery cults, the stories of Demeter and her child Persephone miming the miracle of birth of all nature in the Spring. And of course, in the Christ story, a child bursts forth from a virgin womb, signifying not only the miracle of human birth, but the mysterious birth of God himself. The mystery of incarnation. The problem is that we now know too much to be awed by this anymore, to genuflect or sacrifice to it anymore. We know how plants arise from seed. We ‘know’ that they convert energy from the sun via photosynthesis, and from the soil via mineral transport, and grow cell by cell. We know how humans and all other animals are conceived, via sperm and egg and growth by cell division, all governed by those helical strands of DNA. So the old mysteries, the pretty stories, become myths—tales told by the ignorant to explain processes too deeply embedded in tiny events for the ancients to perceive. And we abandon them, we replace mystery with the “holidays” whose chief purpose is to get us to spend lavishly and keep feeding an economy which depends for its continuance on the utter stupidity of our buying what neither we nor anyone else needs. (I should say that when I was young many years ago, Christmas still had, at least for us, the quality of need: we got coats or boots or gloves we sorely needed to replace outgrown or worn-out ones; and for something impractical, an orange or tangerine that in winter, in the northeast, still had the aura and taste of a rarity.)

But I digress. I was saying how we’ve abandoned the mysteries that are no longer believable—except I suppose in art, like Handel’s Messiah, which still, despite our knowledge, retains some power. But I digress again. What I meant to say, to remind myself, is that incarnation, even stripped of all its mythological trappings by our science, still radiates power. Indeed, it remains the central mystery. And we can, at least partly, thank science for that too. That’s because while rational science has swept away all the “myths” with its penetrating revelations of biology at the cellular level, when it goes deeper, and it has gone deeper, it brings us right back to the mystery again. Though it has shown us, objectively, what happens at the molecular level and even at the atomic level, at the quantum level things get spooky again, mysterious again. That is to say, at the quantum level, we are now told (and virtually none of us can verify this ourselves) that much of elementary matter—those teeny tiny components of atoms and even electrons, with names like quarks and leptons and gluons and bosons—simply appears out of the void. Matter at its most elementary level, the stuff of which we are made, simply pops into existence and then pops out again. And we don’t know why. Physicists have, of course, named this. They call it “quantum fluctuation” (see www.newscientist.com, “It’s confirmed: Matter is merely vacuum fluctuations” by Stephen Battersby). They even attribute the birth of the universe, our universe, that is, to quantum fluctuations (no deity needed) which initiated the process leading to the big bang, which burst in an unimaginably fierce explosion to send all those compressed bits careening out into what has become our universe, inflating and expanding faster and faster until gravity gathered things together to produce galaxies and stars and planets and us.

And it all came from incarnation. Matter just popping into existence. Something from nothing. Here is how Stephen Hawking and Leonard Mlodinow put it in their recent book, The Grand Design (2010):

Quantum fluctuations lead to the creation of tiny universes out of nothing. A few of these reach a critical size, then expand in an inflationary manner, forming galaxies, stars, and, in at least one case, beings like us. (p. 137)


Now I don’t know about most physicists, but to me, that’s pretty mysterious stuff. And it’s not just that I don’t understand quantum mechanics, which I don’t. The truth seems to be that nobody really understands it. There are formulas to explain things, and experiments that seem to prove it works, but when I read that multiple universes (the concept rather makes the word ‘universe’ an oddity) probably sprang from quantum fluctuations and the big bang, and that all those parallel universes probably exist somewhere; or that when particles pass through a slit in a screen, there is the possibility that though some land where we can identify them, some have probably tripped out to the most distant corners of the universe; or that we and our whole universe may be a holographic projection of some outer surface of a black hole, well then I have to say that the great mystery of incarnation still exists. The great mystery, that is, is and has always been: why is there something rather than nothing? How is there something? Is there a where from which we and all else derive?

This, I think, is really what we should be pondering during this season. Incarnation. Whether we should be joyful about it or not I suppose depends, at least in part, on one’s situation. But it also depends on the very fact of being. It depends on the improbable fact that something rather than nothing exists. It depends on the fact that the void, the vacuum, the nothing has produced and continues to generate, every day, every hour, every second, every millisecond more stuff, more of this improbable glory, more impossible incarnation. And though keeping the stupid economy going does not deserve celebration, this, this continuous mysterious incarnation, this ongoing mystery of the word (or whatever it is) made flesh, surely does.

Lawrence DiStasi

Wednesday, December 7, 2011

Occupy Everywhere

With police nationwide having moved in (on Dec. 7 police finally attacked and destroyed the Occupation in downtown San Francisco), the Occupiers in the public spaces of dozens and dozens of American cities have at this point been forced to leave. But is this the end, as many have feared?

Not quite. In what some have called the next logical and brilliant move, the Occupiers have shifted their locus (not their focus) to the core of the crisis: bank foreclosures of homes. As Stephen Lerner, an organizer with SEIU, says: “…we’ve occupied public space — now we need to occupy private space that’s been stolen by banks.” Sean Barry of VOCAL-NY adds: “One of our messages is that there’s more empty homes that banks are sitting on than there are homeless families.”

Still, some might think, ‘oh, foreclosures; that’s old hat, a story that’s over.’ But it isn’t. According to many insiders, the banks have yet to foreclose on millions more homes in the U.S., perhaps as many as 4 million. More than that, the AP reports in its story on foreclosure occupations that “Nearly a quarter of all U.S. homeowners with mortgages are now underwater, representing nearly 11 million homes” (CT Post, December 6). That’s 11 million homes, folks, 1 out of 4. Talk about the Great Depression. Which is what, by the way, Rachel Maddow did on a recent MSNBC show (well worth watching). As an introduction to her sympathetic segment on the Occupy Foreclosures movement, she showed news reports and movie clips of exactly the same kind of resistance during the early 1930s, when millions of Americans were losing their homes and farms. Huge crowds would show up and resist not just passively or peacefully, but by first putting the furniture that had been removed by authorities back into the homes, and then by throwing rocks and utensils and farm implements at police arriving to enforce the evictions. These people were pissed off and they were serious.

So, it seems, are today’s occupiers. The AP report cited above claims that homes in more than 25 cities were involved in Tuesday’s protests. And more are on the way. Said one of the Seattle organizers: “It's pretty clear that the fight is against the banks, and the Occupy movement is about occupying spaces. So occupying a space that should belong to homeowners but belongs to the banks seems like the logical next step for the Occupy movement.” In response, Seattle police spokesman Sean Whitcomb insisted that occupying private property represented the same violation—trespassing—that occupying public space did. The police response, and the penalties, would be the same. But the occupiers are unfazed. In Atlanta, protesters disrupted a home auction of foreclosed properties with whistles and sirens. Several individual home foreclosures have already been stopped, and the evictees given more time to try to work out a deal with the banks. One woman in Cleveland expressed gratitude to the occupiers, who came and camped out in tents in her backyard, frightening off officials who were supposed to come and evict her. She was still in her home on December 6 (see Maddow video). Moreover, the Occupiers have joined forces with groups that have been active for several years (Take Back the Land, Vida Urbana) in defending homes against evictions—supplying fresh and enthusiastic troops for the earlier efforts. The movement also derives encouragement and tactics from movements in other countries like Spain, where the 15M movement has stopped hundreds of evictions and occupied vacant buildings.

That this movement has moral authority can be seen in what NY Times columnist Nicholas Kristof wrote after a recent interview with a former Chase banker named Theckston:

He (Theckston) says that some account executives earned a commission seven times higher from subprime loans, rather than prime mortgages. So they looked for less savvy borrowers — those with less education, without previous mortgage experience, or without fluent English — and nudged them toward subprime loans.
These less-savvy borrowers were disproportionately blacks and Latinos, he said, and they ended up paying a higher rate so that they were more likely to lose their homes. Senior executives seemed aware of this racial mismatch, he recalled, and frantically tried to cover it up. (Kristof cited by Sarah Seltzer, Alternet, Dec. 5)


We’ve all heard accusations about this type of cruel and intentional fraud before, but to hear an admission of it from one of the bankers involved is stunning.

This—the moral authority they have, both currently and historically—is why the Occupy movement has the powers-that-be scrambling for ways to de-legitimize it. I mentioned in my last blog the rumor about a public relations firm being hired by bankers. More recently, Republican talking-points guru Frank Luntz expressed his concern about it to the Republican Governors Association meeting in Orlando: “I’m so scared of this anti-Wall Street effort. I’m frightened to death,” he said, and offered 10 tips on what specific language to use to counter it. First and foremost, “Don’t say ‘capitalism.’ Use ‘economic freedom’ or ‘free market’ instead.” Now this is really interesting: even the Republicans are admitting that the American public now thinks ‘capitalism’ is immoral! And if Republicans are seen as “defenders of ‘Wall Street’,” says Luntz, “we’ve got a problem.” (Yahoo News, Dec. 1)

Karl Marx must be smiling. Imagine, the Republican Party, that bastion of mindless boosterism, is running away from capitalism as a concept. Moreover, Luntz also advises Repubs not to say government ‘taxes the rich;’ instead say government ‘takes from the rich,’ because Americans respond favorably to ‘taxing the rich.’ By God, I sure hope the clueless, pusillanimous Democrats have read this. Because Luntz urges other verbal subterfuge as well, and it all reflects two things: the Republicans are vulnerable and scared (as well they should be, their policies having brought this nation to the brink of disaster), and they have thoroughly absorbed the lessons of the TV age about framing a message properly, while the Democrats have not. Now, finally, there’s a golden opportunity to hang the Republicans with the real message and practice they and their financial masters have been promoting for years: advancing the cause of the 1% at the expense of the 99%.

So far, the only people in the nation who have understood this, and been willing to act on it, are the Occupiers. We can only hope that the American people in ever greater numbers will begin to get it as well, and that the hapless talking heads they elect to public office will follow. The only question is, how much of everywhere has to be occupied, and how many of the rest of us have to be jailed, before the worm turns?

Lawrence DiStasi

Sunday, November 20, 2011

Occupy, Occupy, Here Comes Occupy

I’ve been wanting to comment on the #Occupy movement for quite some time, but events keep outrunning my prose. That’s still true today. So this is just going to be some disjointed musings to emphasize how delighted I am with these young people—the ones who’ll have to live in the mess we’ve created—and how crucial I think their movement is. Just consider: a few weeks ago, the wacky right seemed firmly in command of the entire political spectrum. Obama was reeling from hits to every one of his proposals, no matter how lame. All we heard was the Tea Party and the rantings and ravings of the Republican pretenders to the White House: Tweedledum and Tweedledumber-by-the-minute (I mean really, has there ever been such a gathering of cruel, incompetent morons in a presidential primary?)

Now, though, the #Occupy movement in city after city has changed all that. Just this morning, for example, I read a piece about the latest initiative in Congress: Ted Deutch, (D-FL) has offered a constitutional amendment (he calls it OCCUPIED: Outlawing Corporate Cash Undermining the Public Interest in our Elections and Democracy) to affirm that “rights protected by the Constitution belong to human beings, not to for-profit corporations or other business entities.” It would “prohibit business corporations and their associations from using money or other resources to influence voting on candidates or ballot measures anywhere in America.” Amazing. The Democrats in Congress are clearly feeling the heat from the occupiers, and some, at least, are starting to find some damn backbone.

Of course, it won’t be enough. But this is what such movements are supposed to do: change the debate, and force legislators to act rather than hide behind mealy-mouthed rhetoric. And just before this, I watched a video of a few dozen occupiers marching—on foot, along the highway where people can stop and congratulate them—from New York to Washington. They plan, according to some of their interviews, to barge in on the deadlocked “Super Committee” that’s supposed to be coming up with compromise measures to reduce the deficit. Of course, this “stupor committee” will do nothing of the kind, but the occupiers are pushing ahead, getting some press, and dramatizing the determined inaction of the U.S. Congress.

Even before that, I read the beautiful op-ed written for the NY Times by former poet laureate Robert Hass about his encounter with the police at UC Berkeley’s occupy gathering last week. In brief, Hass and his wife, poet Brenda Hillman, decided to monitor police behavior the night they were to remove the occupiers from UC’s Sproul Plaza. Instead, the Hasses found themselves stuck in a crowd being forced together, and when Hillman sought to engage a policeman in dialogue, he struck her to the ground, also striking Hass when he tried to come to her aid. Hass, nursing bruised ribs, decries the militaristic tactics of the Darth Vader forces that have attacked, without provocation, the occupiers from New York to Denver to Oakland to San Francisco in what many see as a coordinated attempt to intimidate the occupiers, break their movement, and discourage any others who might be thinking of joining them. It hasn’t worked so far. Each broken-up demonstration has simply come back stronger—a fact we learned in the 60s, i.e. that inducing the authorities to overreact is part of revolutionary strategy. And these days, i.e., post-9/11, one hardly has to induce at all. The militarized police forces—the equipping of whom has become a booming industry for America’s military-industrial complex—seem to all be either on hair-trigger alert, or specifically instructed to beat the hell out of a few hundred demonstrators, regardless of provocation or law-breaking, to send a message. Fortunately, the message is having the opposite effect. Police brutality is encouraging, rather than discouraging, more people to join the movement. And if polls are correct, millions of Americans, like myself, are cheering them on from the sidelines.

The police will scale back their brutalities, and already have—especially after the horrific video of a helmeted officer walking calmly back and forth spraying pepper gas directly on a sitting group of UC Davis students blocking a sidewalk, a spraying that called forth condemnation and an investigation by the UC Davis Chancellor. But things have gone very far already, and the police, like all authorities, are fixed in their attitudes. Crowds threaten them. Protest types disgust and alarm them. Used to intimidating, used to immediate compliance with their orders no matter how unreasonable, they respond virtually automatically (their force has been rationalized by one spokesperson who said “linking arms is a form of violence”). Indeed, the conflict between police/soldiers and unarmed demonstrators has become the emblem of our time—in Tunisia, in Egypt, in Yemen, in Burma, in Libya, in Syria. The only question in any such situation is how far these “upholders of law and order” will go to snuff out the legitimate cries of the suffering.

And this is why, in the end, the #Occupy movement is so important. Ordinary people, mostly young people, are demonstrating that the situation—of inequality, of organized theft, of corporate malfeasance, of ecological disaster—has become so dire that they are willing to put their bodies on the line to change not just rhetoric, but everything. Even former lawmen—I know of two who have recently joined the occupiers: Ray Lewis, retired Philadelphia police captain (arrested), and Norm Stamper, former police chief of Seattle—are adding their voices to the rising chorus. Where all this will end is anybody’s guess: it could fizzle in the cold and wet. But one thing is sure. Those in power are taking note, and planning furiously to deflect the movement, infiltrate the movement, discourage and discredit the movement (this just in: Reader Supported News is reporting that a well-known DC lobbying firm has proposed an $850,000 plan to conduct ‘opposition research’ on the Occupy Movement and construct ‘negative narratives’ about it; see readersupportednews.org). There is fear in their hearts, because they know that the movement has focused on the one truth that cannot be denied: we really are the 99%, and without our cooperation, they cannot maintain their exploitation of the masses. For that alone, I salute the occupiers. And hope, when the time is ripe, to join them.


Lawrence DiStasi

Wednesday, November 16, 2011

One Continuous Mistake

I’ve recently read an inspiring book called Fire Monks, which tells the story of how five Zen monks from the San Francisco Zen Center, against all odds, saved the center’s monastery at Tassajara, near Big Sur, during the raging forest fire there in 2008. In one segment, the writer, Colleen Busch, quoted a phrase that SF Zen master Suzuki Roshi liked to use: it referred to life, even a Zen master’s life, as shoshaku jushaku, Japanese for “one continuous mistake.”

That resonated with me. Being a perfectionist, I’m always trying to get everything just right. I think it’s a common ailment among humans: We think we can outwit life and, by doing things just right, insulate ourselves—from rain water coming into our houses, from waste water leaking out, from fire, from flood, from storm, from sickness and death, from all the shocks that flesh is heir to. We cannot. Everything we do, every decision we make is dogged by mistakes. Indeed, when looked at carefully, most life decisions are impossible decisions, impossible to get right, that is. We are constantly making mistakes, always falling short to one degree or another, except where we’re very lucky.

Even in baseball (I’ve also been reading A. Bartlett Giamatti’s profound study of American sport, especially baseball, Take Time for Paradise), the situation is the same. As Giamatti points out, America’s sport is a game dominated by failure. Most attempts to reach base safely, not to mention getting home, are failures. Even the .300 hitter, the game’s crème de la crème, fails to hit safely 7 out of 10 times. Which is to repeat that life is one continuous mistake; we work hard to realize our “dreams,” but most of the time, most of us fail.

I have been reminded of this lately while doing work on my house. To engage in this kind of work—carpentry, painting, roofing—is to engage in continuous mistakes. Measurements don’t work out. One forgets to consider some inner cut. A carpenter I used to work with years ago had a wonderful way of dealing with this: “The universe,” he said only half joking, “is off by a quarter of an inch. Measure all you want, you’ll always be off.” I think he was right. We humans insist on making right angles of the world, when in fact, nature is all circles and curves. So whole pieces of lumber or fittings get wasted. The paint is either too thick or too thin and the color never looks the way it did on the sample. And when it comes to painting it on, there are always “holidays,” or slips of the brush splattering glass instead of wood, and if you’re really paying attention, boards that turn out to be dry-rotted or termite-infested. One continuous mistake.

Nor is it any different in the writing business. Years ago at Harcourt Brace, a manuscript I was preparing for publication had book titles typed in capital letters—a convention for typed manuscripts before computers (typewriters had no italics). The editor was always supposed to convert such caps for the printer by marking them ‘ital’ and lower-casing all but the first letter. In the first book I edited on my own, I overlooked this little detail, so the book was printed with book titles in all caps. Humiliating; but I never mentioned it and no one seemed to notice. Writing one’s own books carries the same, or even greater, hazards. The writer lives in fear of that mistake—the one that appears in the title, in chapter titles, in boneheaded misquotes, in whole paragraphs that get cut off. In fact, I have never written an essay for publication that didn’t get mangled by the publication in one way or another. Most people never notice, but the writer does, living always with the knowledge that no book, no piece of art is ever perfectly rendered, and it haunts him. One continuous mistake.

Even when a star rises to the top, how many are there who avoid conspicuous, often fatal mistakes? Consider the last few presidents we’ve had—successes in the most exalted sense, having made it to the highest position available in America, or, for that matter, the whole world. And yet, think of the recent ones: Lyndon Johnson declining to seek re-election in frustration over the Vietnam War, with war protesters chanting outside the White House, “Hey, hey, LBJ, how many kids did you kill today?” Richard Nixon following him, first with a narrow victory over Humphrey, then with a re-election landslide over McGovern, which triumph led to encomiums about his political genius; and within months had him embroiled in the greatest scandal in American history, Watergate, until, with impeachment looming, he resigned in disgrace, consoling himself only with that pathetic mantra, “I am not a crook.” Then Ford pardoning Nixon, another scandal, mouthing his presidential mantra that no one is above the law, but of course sometimes the law must accede to “reality” (i.e. power). Jimmy Carter taking office next, with high hopes, but shortly after his major achievement at Camp David, confronted with the Iran hostage crisis that drove him from office in ridicule. Then Reagan: Mr. “Morning in America.” But before too long, mired in his own disgrace, Iran-Contra, confirming him as lawbreaker-in-chief, and in hindsight seen by many as the architect of the long-term collapse not so much of the Soviet Union, but of American capitalism itself. Then G. H. W. Bush faltering on every level, shortly after claiming a transformative victory over Iraq. And Clinton with a few victories in the economy, but so unable to control his wee wee that he ends up impeached by the House and barely acquitted by the Senate, his legacy “I did not have sexual relations with that woman.” And W; need we say anything about W? Non-existent weapons of mass destruction as justification for war? Deaths in the millions for what? 
To create the worst economic disaster since the Great Depression? And now Obama, failing utterly to fulfill the promise, siding with the worst elements that brought the nation to the brink, abandoning those who swept him into office, and unable now to get even a small bill through Congress.

In short, the U.S. presidency, not to mention the current Congress, is one continuous mistake.

The trick, for the zenist, and for all of us, is to come to terms with this: the knowledge that no matter how carefully one proceeds, mistakes are continuous. (The obverse is that in zen, it is said, one can never make a “wrong” decision; or a “right” decision, for that matter. One just does what is needed at the time.) Or, as Suzuki Roshi used to say, just commit and do your best: ‘It’s the effort that counts, the sincere commitment to wake up, wherever you are. That’s all anyone can ask.’ Which is also to say, awake to what life is. Despite the assurance of our myths, life is not success; life is not progress; life is not keeping the rain out completely (other animals simply get rained on; and don’t melt). Life is one continuous mistake. Which is pretty much how it proceeds. Mistake after mistake, leading over time, perhaps, to a little tinkering here, a bit of tinkering there, with just enough of us getting by to keep it going, the only success being its continuation; our continuation. It goes on; we go on: the entire creation. Which is about all we can say; and isn’t that, mistakes and all, miraculous enough?

Lawrence DiStasi

Wednesday, November 9, 2011

Wake-up Time?

I have to confess that I didn’t go to the polls yesterday, having glanced at the sample ballot to find mostly school bond issues of little interest to me now. But across the country, what an election it was. Though it may be too much to hope, it seems that our great unwashed are finally waking up to the fact that capitalist democracy, in its present form, is not going to save them. Rather, the oligarchs and banksters and Wall Street billionaires now in control of both the economy and the political process will never be satisfied until they have ground the faces of the working classes into the dirt, stripped them of all dignity, and forced them to shut up, watch the circus, and become slaves. But wait: enter the Wall Street occupiers who, contrary to all expectations, seem to have changed the conversation, and now, voters across the country have shown that they, too, are fed up.

In Ohio, where Governor John Kasich had emulated his Republican counterpart in Wisconsin by pushing a law, SB 5, that stripped public sector unions of their right to collectively bargain, the voters repealed the law in a huge victory for union rights. Over 60% of voters stood with nurses, teachers, policemen and firefighters in a victory that had the Ohio governor sheepishly acknowledging that he had “heard the voters.” I just bet he did. I bet the smart-ass governor of Wisconsin heard too. Perhaps even the billionaire Koch brothers, who financed much of this concerted Republican attack on workers, heard it as well. Because this wasn’t the only reversal for the conservatives who just months ago appeared poised to take over the whole nation.

No. Progressive victories took place in several more states, including Maine, Mississippi, Iowa, Arizona and North Carolina. Something is happening here, Mr. Jones. In Maine, the people voted to maintain their same-day voter registration policy after the right-wing legislature had passed a law to repeal it—employing their usual argument about “voter fraud.” The people didn’t believe it, saw it as disenfranchisement, and yesterday took their right back. In Mississippi, voters struck back on a different front, rejecting another attempt by fundamentalists to pass a constitutional amendment granting “personhood” to a “fertilized egg.” That’s right. On the one hand, these right-wing bozos grant personhood to corporations; on the other, to “fertilized eggs,” thus putting at risk not just abortions, but even birth control. Even benighted voters in Mississippi said “no thanks,” thank god.

But my two favorites, at least in the U.S., were Arizona and Missoula, Montana. In Arizona, the Republican state Senator who had pushed the state’s nasty immigration bill, SB 1070, one Russell Pearce by name, was recalled. Tossed out of office. The gofer for the notorious American Legislative Exchange Council (ALEC)—funded by corporate special interests including the aforementioned Koch Brothers—Pearce this morning was talking about having to re-examine his options after his big defeat. Which probably means figuring out how to maintain his racism by putting a more palatable face on it. No matter. He’s gone and SB 1070 should be toast. The Koch brothers suffered another defeat in Wake County, North Carolina, where voters defeated four conservative school board candidates backed by the Kochs’ “Americans for Prosperity” who wanted to get rid of the district’s diversity policies. In other words, to re-segregate the schools. The voters said no, and replaced them with Democrats. Why, it might even be called morning in America!

Finally, in Missoula, MT (site, incidentally, of the camp where Italian Americans were interned during WWII), citizens passed a resolution proposing to amend the U.S. Constitution to END CORPORATE PERSONHOOD. To me, this is potentially the most important victory of all. This is because the absurd notion that corporations are actually persons, with all the First Amendment rights granted to human beings by the U.S. Constitution—including and specifically free speech (the basis for the Supreme Court decision in Citizens United granting corporations complete freedom to throw money at any and all candidates for public office without restrictions)—makes a mockery of democracy itself. Corporations are fictitious entities. Persons organize themselves into corporations specifically to limit their liability as individual humans in business dealings. That limited liability is granted because it allows corporations to do what individuals cannot—so to then turn around and grant that fiction, with its immunity, the same protections as vulnerable humans is an absurdity. Further, the Supreme Court itself never actually decided this issue; it was the Court’s reporter of decisions, J.C. Bancroft Davis, who added a headnote to the 1886 Santa Clara case that assumed the personhood of corporations—a headnote that slipped by and became precedent ever after. In other words, corporate personhood should never have had the force of law. Since it does, however, the remedy is to pass a constitutional amendment to bring the situation back to where the Founders—Jefferson, Madison, and others who insisted that it was the people who needed protection from corporations—initially put it. Humans have human rights. Corporations do not, except in the fictitious world established in the United States in recent years. 
As one sign in the Occupy movement put it, “I’ll believe corporations are persons when Texas executes one.” It is time to abolish this so-called right, and the voters of Missoula, Montana took a first step. My hope is that before too long, the entire nation will wake up as well, and take the necessary actions to put corporations and their power back in the bottle where they belong. If, that is, it isn’t already too late—which it will be if all of Italy now comes undone (Berlusconi’s downfall being another victory), joining Greece, and the whole Eurozone follows suit. Then it might be too late not only to save Europe, but to save capitalism as well.
First things first, though, and today we can raise a glass to some small, but significant victories. May they continue.

Lawrence DiStasi

Friday, September 30, 2011

Greed Be Good

In 1965, Alan Greenspan wrote:

“It is precisely the greed of the businessman, or more precisely, his profit-seeking, which is the unexcelled protector of the consumer” (Madrick, 228).


This should really be the epitaph inscribed on the tombstone of the American economy. Far from ‘protecting’ consumers, the greed that has defined American business and especially Wall Street these last 40 years has decimated the economy, loaded businesses with debt, put millions of Americans out of work, and transferred huge chunks of American industry to foreign countries such as China. Therein lies the theme of Jeff Madrick’s crucial book, Age of Greed (Knopf: 2011). To read it, with its portraits of banksters and junk bond traders and acquisition specialists and CEOs of America’s largest corporations, is to learn of chicanery, conniving and contempt for average Americans on such a scale as to sometimes convince the reader he is reading Dante’s Inferno. Such characters—some of the mightiest names in corporate and political America in the latter years of the 20th Century, names like Rubin and Weill and Reagan and Greenspan and Friedman and Milken and Boesky and Welch—do deserve a poet like Dante to fix them in an appropriate level of pain and torment. While Madrick is not that poet, he does a creditable enough job of this to sicken even the most cynical reader, for his is the tale of the outright looting and crippling of the American industrial might (along with its workers) that was once the envy of the world.

The book begins with the general proposition that while industry and transportation and communications and retailing were once the foundations of American wealth and prosperity, “by the 1990s and 2000s, financial companies provided the fastest path to fabulous wealth for individuals” (24). And where government was once seen as a needed supporter and regulator of such enterprises, Milton Friedman’s economic doctrines, put into saleable form by Ronald Reagan and Alan Greenspan, turned government into the enemy. As Friedman wrote, “The fact is that the Great Depression, like most other periods of severe unemployment, was produced by government (mis)management rather than by the inherent instability of the private economy.” The answer to all problems, in this tortured view, lay not in government actions to help those who need it, but in reducing government and lowering taxes so as to (allegedly) make the poor better off, eliminate inequality and discrimination, and lead us all to the promised free-market land. As noted above, Alan Greenspan believed wholeheartedly in these and other theories (especially those espoused by his friend Ayn Rand), and Ronald Reagan became the shill for selling such pie-in-the-sky nonsense to the American public. As with his sales work for General Electric, Reagan marketed the kool-aid more successfully than anyone could have anticipated. In office in California as governor, he blamed welfare recipients for the state government’s financial problems: “Welfare is the greatest domestic problem facing the nation today and the reason for the high cost of government.” When he got to the national stage with inflation rampant, he hit government profligacy even harder. “We don’t have inflation because the people are living too well,” he said. “We have inflation because government is living too well” (169). 
All this was coupled with his mantra that getting back to the kind of “rugged individualism” that had made America (and himself) great required reducing taxes. And reduce he did. From a tax rate that stood at 70% on the highest earners when he took office, he first signed the 1981 Kemp-Roth bill to reduce it to 50%, and then, in 1986, with the Tax Reform Act, reduced it even further to 28%. Meantime, the bottom rate for the poorest Americans was raised from 11% to 15%, while earlier, Reagan had also raised the payroll tax (for Social Security) from 12.3% to 15.3%. This latter increase, it should be noted, coupled with the provision that only wages up to a cap (about $107,000 in 2011) would be taxed for SS, meant that “earners in the middle one-fifth of Americans would now pay nearly 10% of their income in payroll taxes, while those in the top 1% now paid about 1-1/2%” (170). And what Reagan never mentioned about his “rugged individualism” is that he was made wealthy by those rich men who cajoled him to run for office: his agent arranged for 20th Century Fox to buy Reagan’s ranch for $2 million (he had paid only $65,000 for it), giving him a tidy profit with which to buy another ranch that also doubled in price when he sold it.
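The wage-cap arithmetic behind that disparity can be sketched in a few lines of Python. This is a hypothetical illustration, not anything from Madrick's book: the 15.3% rate and the roughly $107,000 cap are the figures quoted in the text above, while the sample incomes (and the function name) are my own.

```python
# Effective payroll-tax rate under a wage cap.
# Rate and cap figures are the ones quoted in the essay; the sample
# incomes below are invented for illustration.
PAYROLL_RATE = 0.153
WAGE_CAP = 107_000

def effective_payroll_rate(income: float) -> float:
    """Payroll tax owed as a fraction of total income."""
    taxed_wages = min(income, WAGE_CAP)  # only wages up to the cap are taxed
    return PAYROLL_RATE * taxed_wages / income

# A middle-income earner pays the full rate on every dollar earned,
print(f"{effective_payroll_rate(50_000):.1%}")     # 15.3%
# while a high earner pays it only on the first $107,000 of income:
print(f"{effective_payroll_rate(1_100_000):.1%}")  # 1.5%
```

The point the sketch makes is structural: once income passes the cap, every additional dollar is untaxed, so the effective rate falls steadily as income rises.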

But such tales treat only the enablers. It is when we get to the actual hucksters of this story that things get interesting (or nauseating, depending on your point of view.) The basic scheme went like this: find a company that is undervalued—often because it had managed its assets so well it had cash on hand—and acquire it, using debt to finance the takeover. Then make money—and I mean millions and billions—on all the steps involved in the takeover, including the debt service, the legal fees, and the rise (or fall) in the stock price. For in the age of greed that Madrick documents, the stock price was all. Anything that pushed the stock price of a company up was good. Anything that pushed it down was bad (unless you were one of those smart guys like hedge-fund ace George Soros who worked the “shorts”). And of course, the best way to get a company’s stock price to go up was to increase profits. And the best way to do that was not to innovate or develop better products, but to slash costs, i.e. fire workers. Here is how Madrick puts it:

American business was adopting a business strategy based on maximizing profits, large size, bargaining power, high levels of debt, and corporate acquisitions…Cutting costs boldly, especially labor costs, was a central part of the strategy. (187)


What began to happen in the 1980s and into the 1990s was that all companies, no matter how successful, became targets of the ruthless merger mania that replaced normal business improvements. Lawyers like Joe Flom and takeover artists like Carl Icahn and T. Boone Pickens could spot an undervalued, low-stock-price company (the process reminds one of wolves spotting a young or lame deer in a herd) to take over, using borrowed money to finance it (90% of the purchase price). The borrowing then demanded that the new merged company cut costs in order to service the huge debt required for the merger—which in turn required firing workers. If a company did not want to be taken over, the only way to avoid it was to get its stock price to rise, and this, too, required the firing of workers. In either case, the workers took the hit. But the CEOs running the merged ventures, often sweethearted into selling by generous gifts of stock, “usually made a fortune.” As Madrick notes, in 1986, Macy’s CEO Ed Finkelstein arranged a private buyout of his firm, for $4.5 billion, and became the “envy of fellow CEOs” (174). Like many other mergers, however, this one drained what was one of America’s most successful retail operations, and Macy’s went bankrupt in 1992. Madrick concludes:

The allegiance of business management thus shifted from the long-term health of the corporations, their workers, and the communities they served, to Wall St. bankers who could make them personally rich... (173)


In the process, of course, the Wall Street bankers and leveraged-buyout (LBO) firms like Kohlberg Kravis Roberts who arranged the buys and the financing took in obscene amounts of money. So did risk arbitrageurs (who invest in prospective mergers and acquisitions, angling to buy before the stock price rises on the rumor of a merger) like Ivan Boesky. Earning $100 million in one year alone (1986, when he was Wall Street’s highest earner), Boesky needed inside information to buy early, and got into the little habit of paying investment bankers for that information, i.e. on upcoming deals. Unfortunately for him, he got caught in his banner year because one of his informants (Dennis Levine of Drexel Burnham) was arrested and agreed to name those he had tipped off. Boesky was one (the deal was to pay Levine 5% of his profits for early information on a takeover), and he too was subpoenaed in the fall of 1986. Boesky immediately agreed to finger others (agreeing to wear a wire at meetings), and nailed Martin Siegel, also with Drexel, who, in turn, kept the daisy chain of ratting out associates going by naming Robert Freeman, an arbitrageur at Goldman Sachs. Nice fellows. Boesky was sentenced to three years in prison, but he fingered an even bigger fish, Michael Milken. Milken, then the wealthiest and most ruthless Wall Streeter of all, had made his money in junk bonds (risky, high-interest bonds issued to ‘rescue’ companies in trouble); he was sentenced to 10 years in jail (reduced to two years for good behavior) for securities violations, plus $1.3 billion in fines and restitution. He’d made so much money, though, that he and his family still had billions, including enough to start a nice foundation for economic research, to commemorate his good name in perpetuity.

There are, of course, lots of other admirable characters in this tale, but one in particular deserves mention: Jack Welch, the revered CEO of General Electric. This is because Welch’s reign at GE typifies what greed did to a once-great American institution, the very one that Ronald Reagan shilled for in a more innocent age, the one that brought the Gipper to the attention of the big money boys. Welch made enormous profits for GE (by 2000, his last full year at the helm, GE earnings had grown eighty-fold, to more than $5 billion), and himself, but he didn’t do it the “old fashioned way,” i.e. by developing new and better products. He did it by shifting the emphasis at GE from production to finance. Welch saw the value of this early:

“My gut told me that compared to the industrial operations I did know, the business (i.e. GE Capital) seemed an easy way to make money. You didn’t have to invest heavily in R&D, build factories, or bend metal…” (191)


To give an idea of how this works, Madrick points out that “in 1977, GE Capital…generated $67 million in revenue with only 7,000 employees, while appliances that year generated $100 million and required 47,000 workers” (191). Welch did the math. It didn’t take him long to sell GE’s traditional small-appliance (housewares) business to Black & Decker, outraging most employees, though not many of them were left to protest: in his first two years, Welch laid off more than 70,000 workers, nearly 20% of his work force, and within five years, about 130,000 of GE’s 400,000 workers were gone. Fortune Magazine admiringly labeled him the “toughest boss in America.” And by the time he left the company in 2001, GE Capital Services had spread from North America to forty-eight countries, with assets of $370 billion, making GE the most highly valued company in America. The only problem was, with the lure of money and profits so great, GE Capital acquired a mortgage brokerage (Welch was no mean takeover artist himself) and got into subprime lending. In 2008, GE’s profits, mostly based on its financial dealings, sank like a stone, with its stock price dropping by 60%. Welch, the great prophet of American competition, now had to witness his company being bailed out by the Federal Deposit Insurance Corporation: since GE owned a small federal bank, the FDIC guaranteed nearly $149 billion of GE’s debt. So after turning a U.S. industrial giant into a giant bank, the man Fortune Magazine named “manager of the century” also succeeded in turning it into a giant welfare case. Perhaps there’s a lesson here somewhere.

There’s more in this disturbing book—such as the fact that Wall Streeters not only attacked corporations in takeovers, they also attacked governments (George Soros’s hedge fund attacked the British pound in 1992 and Asian currencies in 1997, causing crises in both places, and ultimately, cutbacks in government programs for the poor)—but the story is the same. During several decades of Wall Street financial predation, insider trading, and more financial chicanery than most of us can even dream of, the high-rolling banksters made off with trillions of dollars, and most others (including union pension funds) lost their shirts. Madrick quotes John Bogle, founder of the Vanguard Group, concerning the bust of the high-tech IPO bubble: “If the winners raked in some $2.275 Trillion, who lost all the money?...The losers, of course, were those who bought the stocks and who paid the intermediation fees…the great American public” (332). The same scenario was played out again and again, in derivatives trading, in the housing boom, in the mortgage-backed securities boom, in the false evaluations of stock analysts like Jack Grubman, in the predatory mergers and subprime shenanigans of Citigroup CEO Sandy Weill, and on and on, all with an ethic perfectly expressed in an email, made public by the SEC, commenting on how ‘the biz’ was now run:

“Lure the people into the calm and then totally fuck ‘em” (334).

That’s essentially the story here. And the sad ending, which most of us haven’t really digested yet, is that the very vipers who cleverly and maliciously calculated each new heist and made off with all the money while destroying the economy, then got federal guarantees and loans that came to more than $12 trillion, that’s trillion, to “save the country.” And now lobby for “austerity” and “leaner government” and fewer “wasteful social programs” like social security and Medicare, and fewer regulations so that their delicate business minds can feel safe enough to invest again. And save us all again with their unfettered greed.

In which case, I’ll sure feel protected. Won’t you?

Lawrence DiStasi

Tuesday, September 27, 2011

Avatars and Immortality

Anyone who has read or heard even a little history knows that the dream of immortality has existed among humans for a very long time. Most of these dreams have been debunked in recent years (though not all: witness the Christian fundamentalist notion of the “rapture,” or the Islamic fundamentalist notion of a heaven full of virgins awaiting the martyrs who blow themselves and others up), and even the Roman Catholic Church has pretty much abandoned its notion of an afterlife in fire for those who’ve been ‘bad’ (whether Catholics still believe in a blissful Heaven for those who’ve been ‘good’ remains unclear to me).

What’s astonishing is that this dream of living forever now exists in the most unlikely of places—among computer geeks and nerds who mostly profess atheism. It exists, that is, in two places: virtual reality, and the transformation of humans into cyborgs (though cyborgs don’t specifically promise immortality, they do promise to transform humans into machines, which is a kind of immortality—see Pagan Kennedy, “The Cyborg in Us All,” NY Times, 9.14.11). If you can create an avatar—a virtual computerized model—of yourself (as has been done for Orville Redenbacher, so that, though dead, he still appears in his popcorn commercials), you can in some sense exist forever. The title of the avatar game on the internet, “Second Life,” reveals this implicitly. So does the reaction of volunteers whom Jeremy Bailenson studied for a Stanford experiment purporting to create avatars that could be preserved forever. When the subjects found out that the science to create immortal avatars of themselves didn’t yet exist, many screamed their outrage. They had invested infinite hope in being among the first avatar-based immortals.

Before dismissing this as foolish dreamery, consider how far this movement has already gone. Right now, the video games that most kids engage in (my grandson has a Wii version of Star Wars in which he ‘becomes’ Lego-warrior avatars who destroy everything in sight) “consume more hours per day than movies and print media combined” (Jim Blascovich and Jeremy Bailenson, Infinite Reality: Avatars, Eternal Life, New Worlds, and the Dawn of the Virtual Revolution, Morrow: 2011, p. 2). The key point about this, moreover, is that countless neuroscience experiments have proved that “the brain doesn’t much care if an experience is real or virtual.” Read that again. The brain doesn’t care whether an experience is “only virtual.” It reacts in much the same way as it does to “reality.”

Frankly, until I read Infinite Reality, all of this had pretty much passed me by. I had read about virtual-reality helmets such as the kind used to train pilots, but I had no idea that things had gone so far. I had no idea that millions of people sign up for the online site called “Second Life” (I tried; it seemed impossibly complex and stupid to me), and invest incredible amounts of time and emotional energy setting up an alternate personality (avatar) that can enter the website’s virtual world and interact in any way imaginable with other people’s avatars. Needless to say, most people equip their avatars with qualities they would like to have, or have wondered about having. Then they go looking for people (avatars) with whom to experiment in a wished-for interaction. The most common interaction, not surprisingly, seems to be sex with another avatar, or several others; but there’s also a lot of wheeling and dealing to gain wealth and prestige. Talk about “be all that you can be!”

Still, the really interesting stuff happens when you get into a virtual laboratory. Whereas “Second Life” takes place on a flat computer screen, virtual reality really comes into its own when you don a headset that can simulate real scenes in 3D with fidelity so convincing that when people approach a simulated pit in front of them, they invariably recoil (even though they’re “really” walking on a level floor). While virtual reality of this kind is expensive today, there can be little question that it will soon become commonplace. Rather than spending tons of money traveling to China, say, one will be able to go there “virtually,” without having to endure the travails of travel, including bothersome other people. What makes this eerie is that video games are already working with this kind of VR, and creating avatars. In games like Pong, and on systems like the Wii, PlayStation Move, and Kinect, the game computer can already “track” a user’s physical movements and then “render” a world incorporating those movements into a virtual tennis scene that is authentic in all necessary details. So,
In a repetitive cycle, the user moves, the tracker detects that movement, and the rendering engine produces a digital representation of the world to reflect that movement…when a Wii tennis player swings her hand, the track wand detects the movement and the rendering engine draws a tennis swing. (p. 44)


As Bailenson notes, “in a state of the art system, this process (of tracking and rendering the appropriate scene from the point of view of the subject) repeats itself approximately 100 times a second.” Everything in the virtual scene appears smooth and natural, including, in the game “Grand Theft Auto,” an episode where players can “employ a prostitute and then kill her to get their money back.” And remember, the brain reacts to all this in the same way it does when it is “really” happening.
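The track-and-render cycle described above can be sketched as a simple loop. Everything in this sketch is a hypothetical illustration of the cycle, not code from the book or from any actual game system: the Tracker and Renderer stubs, the function names, and the fixed pose are all my own.

```python
class Tracker:
    """Stub standing in for a motion sensor (e.g. a hand-held wand)."""
    def read_pose(self):
        # A real tracker would poll hardware many times a second; here we
        # just return a fixed (x, y, z) position for the tracked hand.
        return (0.0, 1.5, 0.3)

class Renderer:
    """Stub standing in for the engine that redraws the virtual scene."""
    def __init__(self):
        self.frames_drawn = 0

    def draw(self, pose):
        # A real engine would redraw the world from the user's new
        # viewpoint; here we only count the redraw.
        self.frames_drawn += 1

def run_cycle(tracker, renderer, frames, hz=100):
    """The repetitive cycle: move -> track -> render, ~`hz` times a second."""
    frame_budget = 1.0 / hz  # seconds available per frame (not enforced here)
    for _ in range(frames):
        pose = tracker.read_pose()  # 1. the tracker detects the movement
        renderer.draw(pose)         # 2. the rendering engine reflects it
    return renderer.frames_drawn

# One simulated second at 100 Hz, the repetition rate quoted above:
print(run_cycle(Tracker(), Renderer(), frames=100))  # 100
```

The design point the loop makes is the one the book stresses: the user never issues a command; the system simply watches, and redraws, fast enough that the brain accepts the result as the world.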

The implications to a psychologist like Bailenson are profound. Short people, for example, who adopt a tall avatar for themselves show definite improvements in their self-image, even after they’ve left the avatar behind. They also show improvements in competition: in real games held afterwards, the person whose avatar was taller became a more successful negotiator. Those who fashion a trim, beautiful avatar show the same rise in self-esteem. Bailenson also notes the importance of people’s attributions of “mind” or reality to inanimate objects like computers, and this includes avatars. In one experiment, subjects were shown a real person named Sally, and then her avatar disfigured with a birthmark (neurophysiological studies show that interacting with a “stigmatized other,” even someone with a birthmark, causes a threat response). After four or five minutes interacting with Sally’s disfigured avatar, subjects displayed the heart-rate response indicating threat—even though they knew the real Sally had no birthmark. And the games sold to consumers keep getting more sophisticated in this regard. In the PlayStation game THUG 2 (over 1 million sold in the U.S.), players can upload their photos onto the face of a character, and then have their “clones” perform amazing feats of skateboarding, etc. They can also watch them performing actions not under their control. This brings up the question of the effect of watching one’s “doppelganger” (a character with one’s appearance) do something in virtual reality. It appears to be profound: the more similar a virtual character is to the person observing, the more likely the observer is to mimic that character. This can be positive: watching a healthy person who seems similar can lead a person to adopt healthy behavior. But other possibilities are legion. Bailenson mentions the commercial ones:

…if a participant sees his avatar wearing a certain brand of clothing, he is more likely to recall and prefer that brand. In other words, if one observes his avatar as a product endorser (the ultimate form of targeted advertising), he is more likely to embrace the product. (119)


In short, we prefer what appears like us. Experiments showed that even subjects who knew their faces had been placed in a commercial, still expressed preference for the brand after the study ended. Can anyone imagine most corporations aren’t already planning for what could be a bonanza in narcissistic advertising?

More bizarre possibilities for avatars, according to Bailenson and Blascovich, seem endless. In the brave new world to come, “wearing an avatar will be like wearing contact lenses.” And these avatars will be capable not only of ‘seeing’ virtual objects and ‘feeling’ them (using ‘haptic’ devices), but of appearing to walk among us. More ominously, imposters can “perfectly re-create and control other people’s avatars,” as has already happened with poor old Orville Redenbacher. Tracking devices—which can see and record every physical movement you make—make this not only possible, but inevitable. Everyone, in all physical essentials, will be archived.

All of this makes the idea of “the real world” rather problematic. Of course, neuroscience has already told us that the ‘world’ we see and believe in is really a model constructed by our brains, but still, this takes things several steps beyond that. For if, in virtual reality, “anybody can interact with anybody else in the world, positively or negatively,” then what does it mean to talk about “real” experience? If “everything everybody does will be archived,” what does privacy mean?

At the least, one can say this: a brave new world is already upon us (think of all those kids with video games; think of how much time you already spend staring at your computer screen), and you can bet that those with an eye to profiting from it are already busy, busy, busy. One can also say, take a walk in the real outdoors with real dirt, grass, trees, worms, bugs, and the sweet smell of horseshit; it may soon be only a distant memory.

Lawrence DiStasi

Friday, September 9, 2011

The Spirit of Capitalism

I have been reading Max Weber’s seminal work, The Protestant Ethic and the Spirit of Capitalism lately and it illuminates a great deal about the spirit of our times—a spirit that has been termed The Age of Greed by Jeff Madrick in his recent book of that name. And while what Madrick describes is really the transformation in the last 40 years of America from an industrialized society to a financialized one, it doesn’t address the origins that interest me here. Weber was interested in this too. His question really was not only ‘why do people work to begin with’ (primary cultures had no concept called “work” at all and only exerted themselves periodically in war or in short-term hunting and gathering), but more relevant to his time, ‘why do people in modern society identify themselves as laborers?’ How was it possible for western culture to transform itself from a traditional culture where labor hardly existed except as part of a manorial household, to post-1600s capitalist society where free laborers are yoked to paying jobs in capitalistic enterprises? More specifically, how could a state of mind that Weber finds best illustrated in Ben Franklin (a penny saved is a penny earned; time is money; credit is money—i.e. it is a duty to increase one’s capital) come to be adopted by whole societies when, in the Middle Ages and before, that state of mind would “have been proscribed as the lowest sort of avarice?” As sinful greed? To illustrate how remarkable this is, Weber compares traditional laborers with modern laborers. A farm owner, for example, who pays his workers at a piece-rate (like modern farm workers paid at so much per bushel), thinks to increase production by increasing rates. This works with modern workers, but when applied to traditional laborers, the increased rate backfires. The traditional worker, that is, not only does not increase his work rate, he decreases it—he works slower so as to still earn the same daily amount. 
As Weber summarizes it, “the opportunity of earning more was less attractive than that of working less.” Thus the attitude of traditionalism:
A man does not “by nature” wish to earn more and more money, but simply to live as he is accustomed to live and to earn as much as is necessary for that purpose. (60)

Weber then devotes his entire book to explaining how Protestantism, especially the Calvinist branch of the Reformation, changed this traditionalist attitude towards work. While a “surplus population which it can hire cheaply” is necessary for capitalism to develop and thrive, so, he says, is a “developed sense of responsibility.” That is, for capitalism to work, “labour must…be performed as if it were an absolute end in itself, a calling.” Far from being natural, or even the product of high or low wages, this attitude “can only be the product of a long and arduous process of education” (62). And the educating body was, originally at least, Protestantism. It is important to note that this education in work did not, at least at first, involve an education in greed, much less enjoyment. To the contrary, Weber makes clear that the essential ingredient, in the beginning, involved a kind of asceticism—not the asceticism of the monastery, but an asceticism in the world. To make labor a calling, that is, meant making labor an obligation in the service of God, of salvation. One was schooled in the idea that hard and constant work was an end in itself, the way of salvation for the average person, and that saving the money one earned was part of that obligation. In order to save, of course, one had to be frugal, buying only what was absolutely necessary. The asceticism that had been the mark of the otherworldly Catholic monastery, that is, was brought into the world. So one worked, one saved (“a penny saved is a penny earned”) and one eventually prospered. It is a commonplace that in the American colonies during the Puritan period (Boston, etc.), these essential elements were merged in such a way that prospering in business became synonymous with salvation—or rather, prospering became a sign of salvation. This is because though election (salvation) or damnation was pre-determined by God, the actual judgment was uncertain, and this uncertainty was almost intolerable. 
One’s prosperity thus became a sign, a way for the uncertainty to be resolved. The opposite was also true: poverty became a sign of damnation, making the poor doubly damned—both in this world and the next. The sad truth is that many Americans still maintain these essential attitudes.

Work as a calling then, work as a duty, and success in work as a sign of salvation are the essential elements of the Protestant ethic. They are also the essential elements of the spirit of capitalism. As Weber puts it,
the expansion of modern capitalism is not in the first instance a question of the origin of the capital sums which were available…but, above all, of the development of the spirit of capitalism (68).


This is not to say that Protestantism ignored the dangers of wealth. Weber cites the writings of Richard Baxter as illustrative. And there, the key to this danger involved idleness and the temptations of the flesh it exposed one to. As Weber interprets Baxter, “Waste of time is thus the first and in principle the deadliest of sins….Loss of time through sociability, idle talk, luxury, more sleep than is necessary for health…is worthy of absolute moral condemnation” (157). A person was thus led to work constantly, to save what he earned, never to enjoy the fruits of his labor, but rather invest those savings as an entrepreneur in new opportunities for more work (and wealth). So while the ethic frowned on wealth and the luxuries it fostered, it at the same time had the “psychological effect of freeing the acquisition of goods from the inhibitions of the traditionalist ethic. It broke the bonds of the impulse of acquisition in that it not only legalized it, but looked upon it as directly willed by God” (171).

Weber ends his work with the ironic contradiction involved in this religiously inspired ethic. He quotes John Wesley, the co-founder of Methodism, as follows:
“I fear, wherever riches have increased, the essence of religion has decreased in the same proportion. Therefore I do not see how it is possible, in the nature of things, for any revival of true religion to continue long. For religion must necessarily produce both industry and frugality, and these cannot but produce riches. But as riches increase, so will pride, anger, and love of the world in all its branches….So, although the form of religion remains, the spirit is swiftly vanishing away.” (175)


The Protestant entrepreneur, in this way, not only won the “feeling of God’s grace” for doing his duty in getting rich, but also a supply of “sober, conscientious, and unusually industrious workmen, who clung to their work as to a life purpose willed by God.” This ethic also comforted the capitalist entrepreneur with the assurance that the “unequal distribution of the goods of this world was a special dispensation of Divine Providence.” For had not Calvin himself said that ‘only when people, i.e. the mass of laborers and craftsmen, were poor did they remain obedient to God?’ (177). He had. So low wages themselves had been rationalized and justified by the divine.

The Protestant ethic, in sum, according to Weber, not only sanctified labor as a calling enabling a worker to be certain of his election, it also legalized, for the capitalist, the “exploitation of this specific willingness to work.” A Daily Double if there ever was one.

It takes little to see how these attitudes and rationalizations are still in use today. America sanctifies capitalism as literally the manifestation of both God’s will and the natural order of things. American media also lionizes those entrepreneurs who, at least according to their own myth, raise themselves by their own bootstraps to become rich—to become “elect” in modern society’s terms. Finally, American capitalism rationalizes the unequal distribution of wealth and goods in this world as simply the workings of natural or divine laws with which mere humans cannot quarrel.

To Max Weber’s credit, he ends his study with a scathing reminder that though this ethic began in the cloak of saintliness, its apotheosis in industrial capitalism became “an iron cage.” Had he known about capitalism’s most recent metamorphosis into an ongoing financial heist creating ever more inequality, his critique would have been far more savage.

Lawrence DiStasi

Tuesday, August 23, 2011

Decision Fatigue, Anyone?

Among the several enlightening articles around last weekend, one stood out for me: John Tierney’s 8/17 NY Times piece on Decision Fatigue. It’s something everyone feels, but few of us understand that it’s a real syndrome, with roots in brain chemistry. That means that it’s not just some anecdotal phenomenon of people who complain, after shopping till dropping, that they’re exhausted—although that’s probably the most common experience for most of us. It’s far more general than that, and, apparently, far more universal (I always thought it was just me who hated shopping at whatever time of the year.) What this means is that the brain actually gets depleted of energy when it has to make lots of decisions—whether or not to eat another donut; whether or not to go online for a few more minutes; whether or not, as a judge, to grant parole to an inmate before you.

According to Tierney, the latter situation was a key one examined recently. In a report this year, two researchers looked into the decisions judges make, in an effort to account for why they rendered different judgments for defendants with identical records. After looking at the usual suspects (racism, other biases), they started to zero in on the time of day the judges made their decisions, and found that judges who made their rulings early in the day were far more likely to grant parole than those who saw a defendant late in the day. Looking even more closely, they found that if you were unlucky enough to appear before a judge just before the noon break, or just before closing time, you would likely have your parole plea rejected; if you saw the judge at the beginning of the day, or right after lunch, you were more likely to get your parole granted. The cause: decision fatigue. As the researchers noted, “the mental work of ruling on case after case, whatever their individual merits, wore them down.”

What this and other experiments have demonstrated is that each of us possesses “a finite store of mental energy for exerting self-control.” And self-control requires that old bugaboo “will power”—a form of mental energy that can be, and often is, exhausted. If you’ve spent your day resisting desire—whether it’s a yen for a cigarette, a candy bar, or a trip onto the internet—you’re less capable of resisting other temptations. Nor is this just a curious finding. What researchers argue is that this kind of decision fatigue is “a major—and hitherto ignored—factor in trapping people in poverty.” People who are poor, that is, constantly have to make that hardest of decisions, the trade-off (can I afford this? can I afford that? Should I pay the gas bill or buy good food?), and such decisions sap their energies for other efforts like school, work or improving their job prospects. This is confirmed by images that have long been used to condemn the poor for their failure of effort: welfare mothers buying junk food, or indulging in snacks while shopping. Far from being a condemnation of “weak character,” however, such activities often indicate decision fatigue, which the poor experience more than the rich because of the increased number of trade-offs their lives require, and hence the decreased willpower left to them to resist impulse buying.

The big surprise in this research, though, comes with the brain studies. Everyone knows that the brain is a great consumer of sugar, or glucose, for energy. But what no one had expected was the specific connection between glucose supply and willpower. In a series of experiments, researchers tested this by refueling the brains of some subjects performing tasks with sugary lemonade (glucose), and some with lemonade sweetened with diet sweetener (no glucose). The results were clear: those who got the glucose found their willpower restored, and thus their ability to exercise self-control augmented. They made better choices, and even when asked to make financial decisions, they focused on long-term strategy rather than opting for a quick payoff. In short, more mental energy allowed them to persist in whatever task was at hand. Even more to the point, the researchers found that the effect of glucose was specific to certain areas of the brain. As Tierney puts it:

Your brain does not stop working when glucose is low. It stops doing some things and starts doing others. It responds more strongly to immediate rewards, and pays less attention to long-term prospects.


This is critical information, especially in our choice-and-distraction-filled culture. Yet another study, in Germany, in which subjects frequently reported their activity via their BlackBerrys, concluded that people at work spend as much as four hours a day “resisting desire.” The most common of these desires were “urges to eat and sleep, followed by the urge for leisure” (i.e. taking a break by playing a computer game, etc.). Sexual urges were next on the list, slightly higher than checking Facebook or email. The most popular general type of desire was to find a distraction—and of course, the workplace of large numbers of people these days centers on the computer, that click-of-the-mouse distraction machine.

And the trouble with all this is that willpower depletion doesn’t manifest with a specific symptom, like a runny nose, or a pain in the gut. As Tierney says:

Ego depletion manifests itself not as one feeling but rather as a propensity to experience everything more intensely. When the brain’s regulatory powers weaken, frustrations seem more irritating than usual. Impulses to eat, drink, spend and say stupid things feel more powerful.


Perhaps this is why tired politicians so often say and do stupid things. Not to mention the howlers of our physicians, our generals, our corporate execs, and our media pundits. Perhaps, too, it explains why Ronald Reagan always kept a jar of jellybeans on his desk—though it is true that his decision-making stemmed from a malady of a different sort.

In any case, the lesson from all this might be: take breaks. Eat candy (or better still, protein). And don’t make important decisions when you’re exhausted (like responding to that nasty email). Most decisions can wait, and will profit from a glucose-rich, rather than a glucose-depleted brain.

Lawrence DiStasi

Wednesday, August 17, 2011

Growing the Economy

I’ve just finished watching the PBS News Hour (8/16), with a major segment on the problems with Europe’s faltering economy. Sheherezade Rehman, an economics professor from George Washington University, stated the conventional wisdom:
“The real long term issue is growth,” she said. “Without growth, there is no way out of this crisis.”

Barack Obama has been saying the same thing, adding that the growth in jobs can’t come from the government; it has to come from business. And everyone nods, and agrees, and reinforces the point in whatever way possible: our system, corporate capitalism, requires growth. Constant and incessant growth. If growth stops, our economy is in trouble, capitalism is in trouble, capitalism dies.

But wait. What kind of system can sustain constant growth? Is there anything in nature that’s like that? Don’t most systems have limits? Lemmings, for instance, grow and grow until overpopulation strains the resources they need, and then they stampede off cliffs in a kind of mass suicide. And don’t natural systems all have predators which, when a population grows too fast, fatten up on the growing population, thereby cutting it back to a sustainable or balanced level? They do. But not humans; not capitalism. It’s as if capitalism is a kind of cancer: it has to grow and keep growing without cease, without limit, no matter what. Until, that is, the host organism dies.

Many thinkers have noted the logic in this. The constant growth demanded by capitalism drives us to the brink of disaster. With the human population out of control (only in the last few hundred years) we humans are drowning in our own waste, we are destroying the resources (like fossil fuels or rare minerals) needed to keep growing, our oceans and rain forests are being devastated, our water sources are being polluted or emptied. And then, there’s the carbon we’ve poured into our atmosphere, producing a little tent for ourselves that traps heat, leading eventually to a deadly rise in the temperature of the planet. Growth, in a word, is fatal to our planet. One of my neighbors has a bumper sticker that sums this up:
Growing the economy is Shrinking the ecology.

The people who are urging more growth, or lamenting the lack of enough of it, then, are either ignorant or insane. They are ignoring the fact that our only salvation, not as individual communities or nations, but as a species, as a planet, is to stabilize growth. To reduce growth to a sustainable level. And one way to do that, perhaps, is to distribute the wealth we already have more equitably. Since that is unlikely to happen, however (and this is another subject that’s been getting increasing attention: the astonishing inequality of wealth in the United States, with the top 1% controlling upwards of 90%, and the bottom 40% controlling essentially ZERO—see the same PBS News Hour, 8/16/11), I see only one solution (aside from a mass revolt aimed at killing the obscenely wealthy among us, that is).

Indeed, in some ways, I think that solution to reduce or stop growth is already under way. People, consumers are not doing this intentionally, of course. They are limiting their buying to absolute necessities because that’s all they can afford: a bit of food, a bit of transportation to jobs, a bit of clothing to cover their asses. And that is precisely what we need. In fact, though, we need even more radical action than that. We need a mass movement, an intentional movement, of outright refusal.

We refuse to take part in this global insanity. We refuse to buy more gadgets we don’t need, faster computers we never asked for, bigger cars we can’t afford to fuel, more elaborate toys to keep our minds off the real issues of life, including our enslavement. We refuse it all. We are boycotting the idiocy of modern capitalism, of planned obsolescence, of disposable diapers and clothes and shoes and packaging for our gadgets and toys and foods, of more and more automation that eliminates jobs for real people so corporate pigs can make more profits, of more and more hospital procedures that extend life beyond the point where it is bearable so hospitals and doctors can get rich. We refuse. We see through the brainwashing (buying beyond what we need is a fairly modern ploy of U.S. capitalism, designed by a few market researchers in the 1920s who, alarmed at the prospect that mass production had become so efficient that people would soon satisfy all their needs and corporations would drown in un-bought products, invented campaigns to stimulate endless psychological desires for useless crap; see the documentary The Century of the Self, by Adam Curtis). We know what brainwashing portends and who profits from it, and we have had enough.

This is what millions need to say directly to corporations, especially those who have moved their “offices” to foreign countries to avoid U.S. taxes. CBS’ 60 Minutes had a segment on this issue Sunday night, and it had me screaming at my TV set as the CEO of Cisco Systems not only admitted that Cisco had moved its corporate headquarters and thousands of jobs to Ireland, but glorified it into a threat: either the United States lowers its tax rate to parity with countries like Ireland and Switzerland, or it faces losing even more corporate taxes. In essence, he and his ilk are holding a gun to our heads, saying if you don’t lower our taxes (already at the lowest point in almost a century), we’ll all be leaving.

If there is a crime called 'treason', this is it.

And my answer, and every refusing American’s answer should be: GO. Take your shit headquarters to whatever country you like, take your shit products with you, and don’t bother coming back. And don’t bother trying to sell your products here either because we will slap a boycott or a tariff upon you and all you produce.

There’s a lot more that I’m thinking, of course, such as the fact, brought up on Yahoo News today, that PayPal founder Peter Thiel, one of our wonderful billionaires so incensed about high taxes, has been revealed as a “big backer of the Seasteading Institute.” Guess what this utopian innovation, to which Thiel has recently donated $1.25 million, proposes to do? Well, given that even all their bought-off lackeys in Congress may not be enough to protect these money hogs from government intrusion, Seasteading seeks to “build sovereign nations on oil rig-like platforms to occupy waters beyond the reach of law-of-the-sea treaties.” It’s the ultimate libertarian wet dream: set up safe havens in international waters free from the laws, regulations, and moral codes of any existing place. Details say the experiment would be “a kind of floating petri dish for implementing policies that libertarians, stymied by indifference at the voting booths, have been unable to advance: no welfare, looser building codes, no minimum wage, and few restrictions on weapons.”


Nor is this some pie-in-the-sky venture. One rep from the Institute is quoted as saying the group actually plans to launch an “office park off the San Francisco coast next year.” (To see that this isn’t an urban myth, check out http://seasteading.org/.)
And these are our sunshine patriots, tearfully singing America the Beautiful.

So think about it next time you are inclined to buy something. Do you really need it? Can you do without it? Patriotism--not national patriotism but planetary patriotism--demands a short, simple response: I prefer not to. (For more on this, see my “Bartleby Option” at www.lawrencedistasi.com.)

Lawrence DiStasi

Tuesday, August 2, 2011

Waka in Bolinas

I was just finishing morning meditation when my neighbor Walter knocked on the door. Toting two cameras, he said, “Come on. You have to see this.” I asked what, as I was getting my shoes on, thinking it might be a large yacht or perhaps some monster pieces of the Bay Bridge nearing completion (we saw some heading to the Golden Gate a couple years ago, shipped in from China where they’d been fabricated). He said nope, and we walked across the field to the cliff overlooking Bolinas Bay. And there lined up was a fleet of seven boats, with strange double sails that looked a bit like the felucca sails on the old boats Italians once used to fish San Francisco Bay. Walter had his high-powered binoculars on a tripod so I was able to get a pretty good look. We could see crews on board each strangely marked boat, plus a white yacht accompanying them. At least one Bolinas fishing boat motored out to talk to them and, I found out later, bring them ice cream. Then it became apparent that these were catamarans, double canoes of traditional Maori design, their red sails and prows decorated with fantastic Maori art forms.

Walter explained what he knew. The boats were from New Zealand, and they were on a Pacific voyage to try to draw attention, via traditional sailing craft, to the plight of the oceans and the related plight of many Pacific Islanders threatened by global warming. They were sailing to San Francisco today, to take part in a World Oceans conference for a week. Then they’d head back to New Zealand, probably stopping again at Hawaii where they’d already been, and other Pacific islands from which their crews had come. That they’d decided to stop in Bolinas for the night added a bit of local pride to the visual thrill. Walter had actually seen them yesterday when fishing for salmon (some of which he gave me) out beyond Point Reyes.

I did a little search on the web and eventually found several accounts of what now seemed an almost magical voyage. What I learned was that the Waka (or vaka; the name of their boats) voyagers had left New Zealand on April 13, after traditional ceremonies, and headed for several other islands as well as Hawaii and the U.S. mainland. According to Hoturoa Kerr, chief of the Haunui waka:
“We’ve got people here whose islands have been covered by rising water levels and their fishing grounds are no longer as abundant. We’re trying to raise awareness to people who live thousands of miles away that what they do affects ordinary people who are, in some cases, subsistence living.” (Otago Daily Times, April 13, 2011).

Kerr also pointed out the “eco friendly” nature of the boats, constructed of carved wood and twine, and which use sail power supplemented by solar-powered motors for harbor navigation. Their food would consist, they hoped, largely of caught fish, supplemented by canned and dried goods and locally produced organic food. They intended, said Dieter Paulmann, filming the voyage for a documentary, to “map their way in the wake of their ancestors, using the stars, sun, wind, and wildlife as their guides.” The plan was to reach Hawaii by early June where the crews would attend the Kava Bowl Summit 2011 for discussions with scientists, media, political and corporate leaders to create ways to move toward the sustainable use of the ocean’s resources. Then it was on to San Francisco for another ocean conference on Treasure Island, with the return via numerous other Pacific islands, and the 11th Pacific Arts Festival on the Solomon Islands in 2012.

By the time we saw them in Bolinas, the wakas had already labored through the Pacific gyre where a revolting “continent” of plastic debris has taken up residence (see my June 6, 2008, blog, “Plastics etc.”). They had been through storms and food deprivation, as well as space deprivation (16 crew members on each 22-foot craft don’t have much space). But they were clear about their mission. The voyage was a kind of dress rehearsal for what humans were going to have to do on a larger scale: adapt to dwindling natural resources. Here are a few entries from the blogs on their website (http://www.pacificvoyagers.org/):

After a few minutes of deep breathing and relaxing my mind I got the image of a whale’s tail in my head.
It slapped the water.
“Listen.
Listen to the breathing of the tides and know that all the world beats with one heart, breathes with one breath.
I started getting distracted by the music and noise behind me.
Slap, slap, slap.
“Listen, listen, listen!
You (humanity, the waka crews, us as individuals) have a special place.
You are the Key….
We do have a special place. It is by our hand that the world and the creatures in it will live or die….
We’ve been becalmed for two days and for two days we’ve been surrounded by all the Life in the sea. Well, seals, dolphins and whales, lots of them. Last night the seals were coming in like aquabats (acrobats with a speech impediment) zigzagging in formation thru the phosphorescence leaving trails of stars behind them. They’ve scared the girls by popping up beside the canoe and barking loudly. They’ve entertained us with their showing off, leaping and jumping, one even going so far as climbing on board the Samoan canoe and sitting on the bow doing nothing and barking at the captain whenever he talked to it (just like the rest of the crew so he tells me)….
And if we hadn’t been halted by the wind we would’ve missed it all. We would’ve zoomed on thru as we do for most of lives, distracted by the music and the white noise of the modern world and really just missing the point.
“Listen, listen, listen!”

and another:
We were told today in an email (thank you Shantparv) that our entry has coincided perfectly with a phase described in the Mayan calendar wherein a great unfolding of consciousness is seen for the first time. I’m all for that! I think a great unfolding of consciousness is somewhat overdue. I think our consciousness has been folded for a very long time. It’s time to take out the creases and realise we’re all on the same page! The healing of the planet will by necessity include the healing of ourselves. We are all essentially the same. We experience all the same stuff in human terms. The words we use to describe Life, the Universe and Everything don’t really matter.

Some lovely photos and quite a few videos are available for viewing on the Pacific Voyage website (http://pacificvoyagers.org/). That’s probably the best way to get a sense of the visual impact these boats and their crews make. Indeed, it’s something no one should miss—though I have to admit that seeing them off my own home coast gave it added juice for me. But you can also “join” the voyage by following along online and sending messages to the crew; they truly appreciate the support and the knowledge that someone is paying attention. And hadn’t we all better do that? The time for action is getting very, very short.



Lawrence DiStasi
Image 9/23 Uto Ni Yalo © Tanja Winkler

Sunday, July 31, 2011

The Real GOP Agenda

In the crisis over approving a rise in the debt ceiling which they themselves created, the Republican Party has reached a new high in hypocrisy and cruelty, not to mention madness. But since calling someone “insane” tends to let him or her off the hook (see Anders Breivik and his 1500-page manifesto), I’ll table that as well as the hypocrisy, and get to the point. The GOP has a clear agenda—to cut government to the bone—and that is what needs to be examined. This is because it is obvious that the GOP doesn’t seem to have any problem with government handouts—so long as the handouts go to banks/Wall Street execs, military contractors, weapons manufacturers, oil companies, big Pharma and giant agribusinesses. They also don’t seem to have any axe to grind when it comes to government privileges for their favorite fundamentalist churches and anti-science wackadoos. What they have set their sights on are “entitlement” programs for the poor, the elderly, the underprivileged: you know, Social Security, Medicare and Medicaid. Never mind that Social Security is paid for over a lifetime of labor by those who get it (providing, by the way, a handy fund for the politicians themselves to “borrow” from whenever they need a little extra cash for a war they refuse to pay for). Never mind that without Medicare, millions would be deprived of the minimal care that can’t even approach the luxurious health plan the pols have crafted for themselves. The most visible targets are government agencies like the EPA, the Internal Revenue Service, the National Parks, the Department of Education, Head Start, the Equal Employment Opportunity Commission, OSHA etc.; and at the local level, the public schools, the public colleges, the parole and prison departments, and all departments that service, again, the poor, the unprivileged young, the elderly, the handicapped. What they can’t get rid of, they will try to privatize—again for the benefit of their corporate funders.

Consider a few lines from a piece by Nicholas Kristof (“Republicans, Zealots and Our Security”) in last Sunday’s NY Times. He first quotes Congresswoman Rosa DeLauro:
“The attack on literacy programs reflects a broader assault on education programs,” said Rosa DeLauro, a Democratic member of Congress from Connecticut. She notes that Republicans want to cut everything from early childhood programs to Pell grants for college students. Republican proposals have singled out some 43 education programs for elimination, but it’s not seen as equally essential to end tax loopholes on hedge fund managers.
So let’s remember not only the national security risks posed by Iran and Al Qaeda. Let’s also focus on the risks, however unintentional, from domestic zealots.

What struck me in this last line was Kristof’s qualifier: “however unintentional.” The truth is, the GOP’s rampage against “bloated government” is both quite rational and viciously intentional (again, Anders Breivik comes to mind). Its aim is to gut every program put together by Democrats from FDR’s Depression programs to LBJ’s Great Society, programs that provide not just a minimal safety net for the least fortunate Americans, but opportunity for all those who have been able, finally, to gain a tentative purchase on a decent life by finding government jobs at various levels. This is the whole point of the GOP’s coordinated campaign to destroy unions, “cut waste from government,” and cut taxes. It’s about eliminating the revenue source for those government jobs. It’s about cutting off the voter base—mostly Democratic—that those government jobs represent. And at its core, it’s about putting back in their place—at the lowest levels of society—all those “uppity” minorities who, through government equal-employment mandates, have ‘risen above their station.’ This includes blacks, Hispanics, and women, as well as the lefties and liberals who have long argued for the inclusion of such minorities in America’s prosperity.

It is this that most grates on the Republican zealots—now concentrated more than ever in the South and the West/Midwest. They hate the fact that teachers have tenure and all those “luxurious” pension plans. They hate the perceived “laziness” of government workers with “cushy” jobs and pension plans. Though it may bring them down as well (and it is doing just that; the massive loss of government jobs in the wake of the 2008 collapse is responsible for a large part of the high and persistent unemployment rate), they are willing to sacrifice their own well-being in order to appease their toxic resentment. This resentment is clear in the symbolic language (“to cut government in half in 25 years, to get it down to the size where we can drown it in the bathtub”) used by the guru of this movement, Grover Norquist. Drown it in the bathtub? What is the size Norquist means here? Baby size? An infant drowned in a bathtub? Who would use such imagery? A privatizing Republican zealot, that’s who. An artist of propaganda, of revenge, of cruelty. The heir to those massive crowds in the pre- and post-Civil War South who could relish the spectacle of torturing, burning, hanging a human being who had dared to transgress their sacred code of two worlds that could never, ever meet.

And with a black man in the White House, that resentment has only festered and grown more virulent, more ugly. Of course not even the most conservative of GOP leaders can come right out and give voice to this. So they use the symbolic language of cutting taxes to cut government spending.

But don’t be fooled. If you’re wondering why the GOP is hell-bent on destroying government (even as Republicans and their corporate masters suck from the tit of that same government, and display a fierce determination to re-take its most visible power source), you can start with two simple but toxic causes: racism and resentment. Then add the infinite greed and casual cruelty of the elite (the top 1% of Americans now control more wealth than the bottom 90% combined; median wealth of Anglo households is now 20 times that of black households—see July 27 Pew Research report), and you’re pretty much there.

Lawrence DiStasi