Friday, October 21, 2016

Slavery in New England

It is very difficult for most conventionally educated Americans to think of “slavery” and “New England” in the same sentence. Slavery, as commonly understood, happened in the American South, where it was intimately tied to plantation cultivation of cotton or tobacco; or in the Caribbean islands, where it was ‘required’ on sugar plantations. But as Wendy Warren points out in her new book, New England Bound: Slavery and Colonization in Early America (Liveright: 2016), the conventional view is wrong, or at least incomplete. Slavery existed quite openly in the Massachusetts Bay Colony and the settlements that grew up around it in Rhode Island, Connecticut, and beyond. In fact, the iconic story of Squanto, the helpful Indian whose aid to the Plymouth colonists in their early years literally saved them, begins with enslavement. For how was it that a Native American was able to communicate with the Mayflower voyagers in English? Warren tells us: Squanto was kidnapped in 1614 by Thomas Hunt, a ship captain associated with Captain John Smith (the one who romanced Pocahontas). From there he was shipped to Malaga in Spain as a slave, released from captivity by Spanish priests interested in teaching him the ‘true faith,’ and managed to get to England. He was then able to sign on for a voyage to the New World with the Newfoundland Company, and eventually made his way south to seek his family in New England. Unfortunately, most of them were dead of smallpox by then; but having been ‘saved’ by his captivity, Squanto lived to ‘save’ the colonists in Plymouth. 
            From these inauspicious beginnings, New England slavery proceeded almost unhindered by either moral scruples or economics. To begin with the morals: not even the Puritans disavowed slavery. The 1641 Body of Liberties of the Massachusetts Bay Colony made this explicit, outlawing slavery and “captivitie” “unless it be lawfull captives, taken in just wars, and such strangers as willingly sell themselves, or are solde to us” (35). In other words, slavery was forbidden for colonists, but for Native Americans captured in wars and for Africans imported across the Atlantic, enslavement was deemed morally justifiable. The way this worked in practice, especially in a land and climate where plantations were essentially impossible, had to do with the Atlantic trade, in which New Englanders played a critical role. As Warren puts it: “Slavery bridged the ocean between New England and the West Indies” (51). That is, though the slave plantations were located far away in West Indian sugar colonies like Jamaica and Barbados, those plantations provided one of the chief sources of profit for New England’s merchants, farmers, and fishermen. The Atlantic world upon which the colonists (and indeed the entire economy of England; see my July 3, 2014 blog, “Slavery’s Role in Modern Capitalism”) depended was one of slavery: “slaves, the output of slaves, the inputs of slave societies, and the goods and services purchased with the earnings of slave products” (53). How did it work? In three main ways: New England merchants traded in slaves themselves, plus other slave-related goods like rum; New England farmers provided their surplus farm products to the slave colonies via those merchants; and New England fishermen profited by selling fish to the plantation owners to feed their slaves. 
This last trade was highly profitable because spoiled fish that New England fishermen could not sell to English colonists could still be sold to plantation owners as food for their slaves. Even at reduced prices, that is, the rotten fish that slaves were forced to eat represented a gain to fishermen (and fish merchants) who would otherwise have suffered a total loss. 
            If it were only by proxy that New Englanders involved themselves in the slave trade, that would be bad enough, for slavery in the West Indies was one of the most vicious and “deadly innovations known to humanity” (52). But New Englanders involved themselves in slavery in many other ways. One was by selling Indians captured in conflicts such as King Philip’s War and the Pequot War into West Indian slavery. Native Americans did all they could to avoid this fate, for it was literally a death sentence. But few were able to avoid it (except by dying), for the wars to exterminate the Indians were bitter and total and proceeded from the very beginning of English settlement (Plymouth itself was a military garrison). Indeed, this gets to an important distinction between two kinds of colonialism: extractive colonialism and settler colonialism. In extractive colonialism, native peoples are kept alive and their labor used, or rather exploited, to ‘extract’ the riches of the colony. The Americas, however, belonged to the other type, settler colonialism, where the “primary goal is to attain control of the territory” (90). And the corollary of settler colonialism is that the indigenous people must be removed. The Atlantic slave trade thereby became one part of the solution to the problem of removal. That is,  

it offered the English a way to remove Indians from the region [making a profit in the bargain], and then replace them with bodies [i.e. indentured servants or African slaves] who would want to work the land in a way meaningful to the colonizers (91).

Thus did the English colonists become active participants in slavery, first selling Native Americans into slavery to get rid of them, and then importing Africans as slaves to replace them. War was involved at both ends of the bargain: “Indian slaves in New England were, mostly, sold because they had been captured in wars, whereas African slaves were captured in wars so that they might be sold” (91). This seemed quite ethical to colonial New Englanders, for it was foreseen early on by Emmanuel Downing, John Winthrop’s brother-in-law, who wrote to Winthrop in 1645:

If upon a Just warre the lord should deliver [the Indians] into our hands, wee might easily have men woemen and Children enough to exchange for Moores [Africans], which wilbe more gaynefull pilladge for us then wee conceive, for I doe not see how wee can thrive until wee get into a stock of slaves suffitient to doe all our business… (95).

            Thus it was that slavery became commonplace among those who could afford slaves in the early colonies of New England. Wendy Warren illustrates this by summarizing countless legal cases in which slaves figured in lawsuits, wills, or criminal proceedings. One or two examples will suffice. An early will, drawn in 1678 in Milford, CT, specifies the giving of “two of my negroes, a man and a woman, to my son George Clarke, which he shall chuse,” and a negro each to “my son Thomas Clark” and “to my daughter Sarah” (127). Slaves were also punished for “fornication,” either among themselves or with white indentured servants, the catch being that slaves were denied the right to marriage and family, so any babies produced were automatic proof of illegality. A complex case in the 1660s involving three generations of slaves named Warro illustrates this. A prominent colonist named Daniel Gookin brought two of his slaves, Jacob and Maria Warro, to New England from Maryland, along with two Warro children, Sylvanus and Daniel. First, Daniel got in trouble for fathering a child with a slave woman named Hagar. Then his brother Sylvanus, having been sold (or rented) to a William Parke of Roxbury, committed “fornication with Elizabeth Parker,” an English servant of Parke’s, and was sent to prison for it. Just after his release, however, Sylvanus Warro was arrested and convicted of stealing from Parke, and sentenced to pay twenty pounds and to receive twenty lashes. Though he could endure the whipping, Warro could never pay such a sum, yet the court also ordered him to pay weekly child support for the child he’d fathered with Elizabeth Parker, adding to his impossible burden. Finally, the court ruled that he could discharge the debt by being sold as a slave for the equivalent value. 
Meanwhile, Parker herself returned to England, and the child, Sylvanus Jr., was ordered, when old enough, to be put out to service for thirty years to compensate his master Parke for the expenses incurred in raising him. Sylvanus Warro himself, meantime, was sold by his original master, Gookin, to Jonathan Wade, to serve him for the rest of his life. In the end, his son, Sylvanus Jr., served out his thirty years of service, only to be brought before authorities in Boston as a kind of vagrant: “a Lame Cripple.” The Boston authorities had no interest in supporting him, ailment or not, and ordered him to “depart out of this town” (175). In short, three generations of an entire family were exploited, punished, jailed, sold, and exiled at the whim of owners or authorities with hardly a thought.
            At last, in 1700, the diarist and Salem-witch-trial judge Samuel Sewall wrote The Selling of Joseph, the first writing from New England to call for the abolition of slavery. Insisting that all slaves were “the Offspring of GOD, and their Liberty…more precious than gold,” Sewall’s three-page pamphlet set off a storm of rebuttals from other colonists. For even Cotton Mather, New England’s iconic Puritan minister, had written that nature itself insisted that “there must be some who are to Command, and there must be some who are to Obey.” Mather also wrote Rules for the Society of Negroes, his attempt to keep Africans from making trouble by having them agree that “if any of them should, at unfit Hours, be Abroad, much more, if any of them should Run away from their Masters, we [i.e., we Negroes] will afford them no Shelter: But we will do what in us lies, that they may be discovered and punished” (232). At all events, Sewall was far ahead of his time, and with the news of slave revolts in Jamaica, there was little chance that his call for abolition would be heeded in 1700. But though it would take almost a century, slavery was finally ended in Massachusetts in 1783, when the decision in the Quock Walker case found slavery incompatible with the state’s new constitution. Rhode Island and Connecticut more or less followed suit at about the same time, enacting “gradual emancipation” laws providing that children born to enslaved mothers after 1784 would not be enslaved for life, but would receive their freedom after a term of years: twenty-five years in Connecticut, twenty-one in Rhode Island.
            Still, it is important to bear in mind that our puzzlement over the three-fifths compromise in the United States Constitution (whereby three fifths of the enslaved population, though unable to vote, was counted by slave states to inflate their representation in Congress) may be due to our assumption that surely those moral New Englanders (not to mention our hyper-moral founding fathers) would have fought such a travesty to the bitter end. What we now see is that they all had a deeper hand in the slave game, and for far longer, than most of us could have imagined. 

Lawrence DiStasi

Sunday, October 16, 2016

Long Live the Microbes

The twentieth century may well go down as the century of war against microbes, the bacteria, viruses, and parasites found to be responsible for infectious diseases like malaria, cholera, tuberculosis, smallpox, and the common cold and flu. Most of these diseases are communicable, so much of human hygiene became the struggle to keep ourselves clean and free of “germs,” up to and including the overuse of the antibacterial soaps that were the rage a few years ago. Chlorine bleach became as common in western kitchens as salt. More recently, however, we are being told that all this cleanliness that we thought next to godliness may not be so great for us after all. Rodney Dietert’s new book, The Human Superorganism (Dutton: 2016), takes this idea to the next level. Dietert, a professor of immunogenetics at Cornell, goes so far as to state that “human mammals are not really viable. As mammals devoid of microbial partners, we lack what we need to exist” (p. 28). In this view, all that cleaning with disinfectant, together with the overuse of antibiotics in both children and farm animals, has led to an epidemic of non-communicable diseases (NCDs). (Chickens, for example, were given antibiotics in their feed because the normal immune response against pathogens caused muscle loss, and muscle loss meant small chicken breasts, while producers and consumers wanted large breasts, i.e., lots of white meat.) The NCD epidemic, that is, is the result of our war on what is literally an indispensable part of ourselves: the microbiome. We are composed in large part of bacteria, something the biologist Lynn Margulis pointed out years ago. Some estimates of bacterial cells in the human body range as high as 90 percent of the total cells. This astonishing number refers to the more than 10,000 different microbial species in and on our bodies, from our gut to our mouths to our skin: “one square inch of our skin can contain up to six billion microorganisms!” (5). 
This, in short, is what Dietert means by calling humans a “superorganism”: we humans are made up of thousands of species (mostly bacterial), and that includes the genetic information we carry, which is not, as we have most recently been instructed, just the 22,000 genes in our mammalian genome. No, Dietert points out: “ninety-nine percent of the genetic information within the space we call you is not from your genome” but from all those microbes in your microbiome. Ten million microbial genes in all.
            Does this matter? It matters crucially, according to Professor Dietert. The reason is that countless processes we need, for digestion, for our immune systems, for our very preferences in food and partners, derive from the work of our microbiome; deficiencies in the microbiome thus compromise them all. Without the full complement of microbes, that is, we humans are incomplete. And the consequence of being incomplete is the current epidemic of NCDs the world population is suffering from: obesity, autism, heart disease, asthma, and all the allergies and autoimmune diseases that plague many of us these days. Dietert quotes the World Health Organization to the effect that these NCDs now kill three times as many people as infectious diseases (68% to 23%). That’s because defects in the microbiome (occurring through drugs, diet, oversanitation, stress, and exposure to chemicals) lead directly to defects in the immune system, which determines what gets through to our cells, tissues, and organs, and what does not.
            Is there scientific evidence that this is the case? Consider one study done by Derrick MacFabe at the University of Western Ontario. MacFabe demonstrated that simply by altering the concentration of one gut bacterial metabolite, the short-chain fatty acid known as propionic acid, he could make normal mice antisocial to the point where they ignored their littermates and obsessed over a ball. In the same vein, John Cryan and researchers at University College Cork, Ireland, raised germ-free mice (“bubble mice”) and found that, lacking normal gut microbes, the mice had “altered gene expression” in the amygdala. Such mice lack social cognition, are generally antisocial, and “have eerily similar social interaction profiles to those of autistic children” (247). As Dietert summarizes it, these metabolites (products of bacterial metabolism), including such things as the vitamins B3, B5, B6, B12, and K, and critical brain chemicals like serotonin, dopamine, acetylcholine, and norepinephrine, “can influence virtually every physiological system and tissue in the body” (70). They influence what we like to eat, whom we are attracted to, if anyone, and whether we can function normally or live our lives plagued by the excessive inflammation that leads to diseases like atherosclerosis. Again, as Dietert puts it, virtually every NCD, itself often due to loss of the full integrity of the microbiome, “has unhealthy inflammation with excessive oxidation at its core” (129).
            How does this happen, and why is it happening so much now? As noted above, modern conditions are major contributors to the compromising of our microbiome, our microbial partners. Dietert lists six factors driving the epidemic: 1) antibiotic overreach; 2) the food revolution and diet; 3) urbanization; 4) birth-delivery mode; 5) misdirected efforts at human safety; 6) mammalian-only medicine. Antibiotic overuse has received a great deal of attention in recent years, and many food producers have now cut back on or eliminated antibiotics in animal feed. But not all. One area where antibiotic use still reigns, as far as I can tell, is in compensating for the obscene crowding that takes place in cattle feedlots and poultry cages. Disease cannot help but be rampant amid such intolerable crowding, and antibiotics are required to keep it in check. In addition, doctors have routinely prescribed antibiotics for minor ailments like ear infections, compromising not just the offending bacteria but all the bacteria in and on our bodies. Use of probiotics like yogurt can help, but Dietert and others now recommend a major effort to limit the use of antibiotics to only dire cases. One other area where antibiotics have been used routinely is childbirth. Dietert makes much of the fact that Cesarean sections have increased exponentially in recent years, doubling and tripling in many countries. Today, Cesarean sections in the U.S. account for up to 33% of all births, compared to 24% in England, 40% in parts of India, and 46% in China. One major result, according to Dietert, is that the newborn fails to get the contact with vaginal microbes that it needs to complete its own microbiome; it is born incomplete. 
In addition, Cesarean sections run the risk of infection, so antibiotics are routinely employed pre-surgically, compromising the mother’s microbiome, and thereby her infant’s, even before birth.
            One other factor deserves mention here (though all of them are important): the role of the chemicals with which our environment, our food, and our bodies have been flooded. Something as simple as food emulsifiers (the chemicals that smooth things like ice cream, chewing gum, and artificial tears), such as Polysorbate 80 and carboxymethylcellulose, has been “found to alter gut microbe populations by thinning the mucus layer and increasing inflammation” (167). In experiments on mice, that mucus thinning eventually led to inflammation-driven NCDs. Bisphenol A, the infamous endocrine disruptor found in plastic baby bottles, food packaging, and other plastics, has also been found to compromise the gut barrier, again leading to increased inflammation and NCDs. And that scourge of our planet, glyphosate, the chief agent in Monsanto’s Roundup herbicide, has been found to “selectively alter the ecosystem of environmental microbes (in the soil), such as favoring the formation of some types of biofilms” (256). This leads to a reduction in some fungi needed by grass roots, which in turn means less forage for ungulates. In a study of chicken gut microbes, pathogenic bacteria were found to be more resistant to glyphosate than the helpful bacteria, thus giving the pathogens an edge! And in Germany, glyphosate was found to alter the microbe mix in cows, reducing the protection their gut bacteria (Enterococcus) normally provide by restricting the growth of botulism spores. The result: an increase in botulism-related diseases in cattle in recent years.
            Enough said. Even if Dietert has overstated his case, it seems clear that a revolution in human health and food intake is under way. Dietert’s main recommendations call for a reduction in the flagrant use of antibiotics, a return to natural forms of childbirth where possible, and an increase in the use of probiotic foods. This gets to another point he makes: that traditional cultures, before the advent of refrigerated transport and frozen foods, used fermentation to keep foods from spoiling. Sauerkraut, kimchi, miso, tempeh, chicha, wines and beers, and of course yogurt were used by various cultures to keep foods edible, the side effect being that all of these foods carry bacterial cultures that promote and restore the gut microbes we need to digest most of our foods. Some doctors have for years been recommending yogurt after antibiotic treatment, precisely to restore the favorable gut flora. Now Rodney Dietert’s book provides the scientific rationale for even more attention to the microbes that foster our health and our very lives. It wouldn’t take much, except of course for the resistance sure to be mounted by industries like Big Pharma, which are quite content with the epidemics that keep millions sick and them wealthy. In that regard, we can expect lots of ‘scientific’ studies to prove that diseases like obesity, autism, asthma, and the rest are genetic. For myself, though, eating yogurt regularly and even preparing sauerkraut in my own kitchen seems like a much better option, until, that is, the world wakes up and outlaws GMOs, glyphosate, Monsanto, and the many others like them. At the least, it’s worth thinking about.

Lawrence DiStasi

Monday, October 10, 2016

The Creep Factor

After watching the second presidential debate last night (and it wasn’t easy to stay tuned), it occurred to me that almost apart from anything that was said, the real takeaway from this insult fest was the singular creepiness on display. And here I am referring not to the occasional creepiness of Hillary, though there is something of that in her, but to the unparalleled creepiness of Donald Trump. The Drumpf is a really creepy human being, if, in fact, he even qualifies as human. This is what sticks in mind from last night’s debate: The Donald prowling the stage in his overly large blue suit and red power tie, flaunting his straw hair and paunchy gut and floridly pockmarked face, sniffing repeatedly as if with a stuffy nose, and generally creeping out everyone nearby; Hillary trying to get as far from him as she possibly could; and even the moderators, Martha Raddatz and Anderson Cooper, indicating with their body language and constant frowns that they too hoped he wouldn’t get any closer. The man is a walking disease, though I am quite happy to admit that the latest bombshell to hit his campaign (the release of his obscene gloating, with one of the little Bushies, over his ability to molest the beauties he encounters because, as a star, he is allowed even to grab their pussies) may have had its influence. I mean, who would want to be within even hearing distance of such a creep?
            And yet, he is the presidential candidate of one of the two major parties. And yet, his popularity ratings continue no matter what he says or does. How is one to understand this? To what can it be compared?
            In a way, to find any reasonable comparison for the possibility of such a creep becoming president, one has to go back to the Romans; I mean Roman emperors like Tiberius, Caligula, and Nero. To be sure, it was Augustus who initiated the damage (or perhaps Caesar before him): the discarding of the ancient Republic in favor of an empire ruled by an Emperor. Absolute rule, in other words, with the Senate as a mere rubber stamp. But Augustus more or less tempered his dictatorship with the democratic traditions he grew up with. Not so his successors. With increasing savagery, Tiberius and Caligula and Nero introduced a barbarism, a licentiousness, a cruelty and perversion that mass audiences became familiar with in the PBS dramatization of Robert Graves’ hit, I, Claudius. I don’t know; perhaps Donald Trump himself watched and was inspired by the show (perhaps that’s where he got the idea of prosecuting Hillary once he’s elected, something dictators routinely do to their political opponents). Or perhaps he’s just naturally a megalomaniac, a homegrown, dyed-in-the-wool creep. However it happened, Trump’s creepiness, on full display last night, can only be compared to that of those ancient Roman creeps who cemented the total abandonment of democracy in ancient Rome and ushered in the imperial debauchery that has since become our standard for the unbridled government power that corrupts absolutely.
            Beyond the nausea, though, what also occurred to me is that, as in I, Claudius, we may be witnessing the ‘hatching out’ of all the imperial poisons that have been growing in our own alleged democracy in recent years. Trump may be the emblem, the representative, the payback for all the sins of our fathers, in other words; sins that, though long denied by the national myth, have become increasingly apparent. Far from being conceived innocently as the great “shining city on a hill,” the United States of America was founded on slaughter and slavery. The slaughter of the Native Americans encountered by those intrepid colonists at Plymouth Rock and Boston seems clearer and more irredeemable every day (see the recent book by Wendy Warren, New England Bound: Slavery and Colonization in Early America). Nor did it end with the New England wars, King Philip’s War and the Pequot War, both of which ended with the few surviving Indians enslaved and shipped to Caribbean sugar plantations. The wars against Native Americans lasted well into recent history, culminating only a few decades ago with the imprisonment of Leonard Peltier. As for slavery, that has not ended either, even with the Emancipation Proclamation, but has only shifted emphasis, first to the horrors of Jim Crow in the South, and then morphing into the Drug Wars of the 1980s, which ‘solved’ the problem of black men by putting huge numbers of them in prison. But instead of solving the problem, the endless cultural disease spawned by slavery has only come back to haunt us again and again. Not even a black president has made a difference; on the contrary, the Obama presidency seems only to have stoked the fires of white racist resentment to burn even hotter in recent years. 
And in response to accusations that unresolved racism lies behind the commonplace police killings of black men, we get even more resentment and racism, now openly expressed on the internet and sanctioned by the thinly veiled racism of a major presidential candidate and his openly racist followers. And it is truly creepy.
            So this is what last night’s debate left me—and, I gather, many others around the nation—with: the need to take a long, hot, disinfecting shower. The need to distance myself and those I love from this creep who is pretending to the presidency. I’m not sure if some of the citizens of Rome or those in the Weimar Republic had this same feeling. But I would guess they did. What alarms me is that neither their numbers, in either case, nor their revulsion and disgust were enough. The creeps in ancient Rome and the little creep in Germany were able to con sufficient numbers (by redirecting the wrath of the masses at convenient scapegoats) to reach the heights of power, at least for a time. Trump, even wounded as he is, may be able to do the same—especially if large numbers of us respond to the creepiness by casting a pox on both their houses and refusing to vote at all. That would be the fulfillment of Republican hopes and strategy, and the great mistake of all time, but I fear in this season of amazing improbables, it may be possible. And that is a truly terrifying thought.

Lawrence DiStasi