Some of you may recall a John Prine song popular in the 1970s titled, “Blow Up Your TV.” The first two lines are: “Blow up your TV/ Throw away your papers.”
That song came to mind after I watched the terrific documentary, The Social Dilemma, because essentially the same advice is coming from people like Jaron Lanier, featured in the film. Lanier wrote a book titled Ten Arguments for Deleting Your Social Media Accounts Right Now, except that Lanier is not a folksinger; he’s a computer scientist who may know more about computer technology than anyone around. And what he, along with many others, is now recommending is this: ‘get rid of your social media apps now; they’re a danger to everything we claim to value.’
To be sure, not everyone in this film offers such extreme advice. But they come close. For the ‘social dilemma’ of the title refers to the dilemma we all face with a technology that has brought us major benefits and advances in communication, but whose basic design and workings have brought us and our putative freedom great harm that is growing greater by the minute. And what makes the film so persuasive is that the major voices sounding the alarm are almost all veterans of the media giants of Silicon Valley like Facebook, Twitter, Google, YouTube, Pinterest, and Instagram. They include Lanier, already mentioned; Tristan Harris—founder of the Center for Humane Technology (CHT)—who was a Design Ethicist at Google; Justin Rosenstein, also a former engineer at Google; Tim Kendall, formerly of Facebook and Pinterest; Aza Raskin, formerly of Firefox and co-founder of CHT; plus a galaxy of experts in related fields like Roger McNamee, venture capitalist; Dr. Shoshana Zuboff of Harvard, author of The Age of Surveillance Capitalism; and Dr. Anna Lembke of the Stanford School of Medicine. This does not exhaust the list, but it gives one an idea of the heft of these critics, and their inside knowledge of what they’re criticizing. And what they’re basically saying is that, while we ‘users’ of computers and smartphones and their social media programs think we’re getting to choose all these neat services free, they are anything but free. To quote Aza Raskin,
“If you’re not paying for the product, then you are the product.”
To which Jaron Lanier adds that it’s worse than that: it’s not just your attention that these giants want as their product:
“It’s the gradual, slight, imperceptible change in your own behavior and perception that is the product.”
Think about that. You are not only an object—a product being sold—but the target of a sophisticated campaign to change your behavior so that you become less and less capable of looking away from that screen, and thus more likely to look at what advertisers want to sell you. The certainty of your attention is what makes so much money for social media companies. As Shoshana Zuboff says, “That’s their business; they sell certainty.” The certainty Zuboff means is the certainty that you will watch their ads, and their ability to predict that some of you, even a small percentage, will buy what they’re selling. Those predictions are worth billions, even trillions.
To get that reliability, media companies like Google and Facebook employ AI, artificial intelligence made possible by supercomputers, to compile mountains of data. That data is analyzed in such sophisticated ways that your every move is recorded and known, along with what you will likely do next. As Jeff Seibert, a former executive at Twitter, puts it: “Every single action you take is carefully monitored and recorded.” And what do they do with all that data they so assiduously mine? Aza Raskin answers: “They build models that predict our actions, and whoever has the best model wins.” In the film, this is made graphic by showing avatars of some of the people being monitored. And the goals of all this data mining (some would call it “hacking our minds”) are threefold: a) to guarantee engagement (that is, to capture your attention); b) to foster growth (that is, to get you to invite friends); c) to sell advertising (that is, to make as much money as possible from companies that pay a lot for that certainty, that predictability). Jaron Lanier sums this up:
“We’ve created an entire global generation for whom the very meaning of communication, of culture, is manipulation…We’ve put deceit and sneakiness at the center of everything we do.”
That’s because while users think they are making their own choices, and getting all this cool information free, in fact, they are the object of a massive and powerful technological complex, whose aim is to drive them to make the choices that will produce the most certainty, and the most money. And all in the service of supplying buyers for the useless products being sold.
The problem with all this is that you, the user, have no idea you are being manipulated. Tristan Harris likens the smartphone to a slot machine that gives the gambler just enough hits to get him habituated, or hooked. “You are a lab rat,” says Tristan Harris. We’re all lab rats in this huge experiment to see what colors and patterns and posts work to grab our attention and keep it; and every bit of brain science and psychology available is fed into AI and those who run it to see what manipulation works best. Chamath Palihapitiya, former VP of Growth for Facebook puts it this way:
“So we want to figure out psychologically how to manipulate you as fast as possible, and then give you back that dopamine hit.”
Dopamine, of course, is the brain neurotransmitter involved in giving us rewards, and thus pleasure, and is crucial in motivation, memory and attention. It is therefore the focus of enormous effort by media experts seeking to get people hooked on their devices. And all of it—the data mining, the brain hacking, the profit motive—is done behind the curtain, without the user’s knowledge. Sneaky. The film adds an important statement by Edward Tufte:
“There are only two industries that call their customers ‘users’: illegal drugs and software.”
And Dr. Anna Lembke, of the Stanford School of Medicine, confirms the rightness of this metaphor:
“So here’s the thing: social media is a drug. I mean, we have a basic biological imperative to connect with other people.”
Does this use of media as an addictive drug have real-world effects? You can bet on it. First of all, it means that these computerized algorithms know more about what we are likely to do than we do ourselves. That’s predictability on steroids. Then consider Jonathan Haidt’s statistics, shown on a graph in the film. First, U.S. hospital admissions for non-fatal self-harm: up 62% since 2008 for girls aged 15-19, and up 189% since 2008 for girls aged 10-14. Second, suicide rates for the same groups: up 70% for ages 15-19, and up 151% for pre-teens. Haidt sums it up this way: “A whole generation is now more fragile, anxious, more depressed—and much less comfortable taking risks.”
This is all worrying, in many ways terrifying. But the most revelatory segment for me was the one about how these technologies are now enabling, indeed abetting, the massive increase in disinformation. And in the U.S. at least, usually not for nefarious reasons: it’s all in the service of getting people more addicted, and thus more susceptible to advertising. Some might see this as benign (‘it’s just the marketplace operating’). I do not. Because at least with a dictator, one knows one is being manipulated; here, no one seems to have any idea, and so all of us are less guarded, more susceptible, more resigned. This becomes very important when we learn that these platforms actually tailor their responses to what they know the user wants to see or hear. Justin Rosenstein:
“When you go to Google and type climate change, you’re going to see different results depending on where you live.”
WHAT!! Yes, you read that right. If you live in a conservative zip code, you might see several entries saying climate change is a hoax, or is not scientifically verified; if you live in a more liberal zip code, you’re likely to be fed entries confirming its validity, its human cause, and the dangers of melting icecaps. And it’s not just Google. Your Facebook news feed is driven by this same manipulation, and it’s even more powerful. Tristan Harris points out that Facebook provides each person with a different news feed, based on the computer’s calculation of what fits with that person’s known interests. So, if the data shows you have a preference for conspiracies, you’ll get that reinforced with posts promoting conspiracy theories and related content; if you’ve shown an interest in weapons or philosophy or left-wing or right-wing activism, you’ll get those reinforced—and all in the interest not of conspiracy theories or activism, but of getting you more hooked. Roger McNamee points out the perils of this:
“Each person [i.e. on social media] has their own reality, their own facts…Over time, you have the false sense that everyone agrees with you because everyone in your news feed sees the same—and once you’re in that state, it turns out you’re more easily manipulated.”
Now we see how the polarization that is so disturbing to our politics comes about, and gets reinforced daily. Social media is strengthening, is in fact shaping that very polarization! We see this when we learn that the people one wonders about—the ones who believe outlandish theories such as QAnon or Pizzagate—are simply not seeing the same information as someone who reads the NY Times or the Washington Post. So their beliefs are constantly reinforced by seeing more of what fits their world view. Perhaps the worst part of this manipulation is that fake news is now known—via a study of Twitter—to spread six times faster than real, that is, verified news. Sandy Parakilas makes this vivid:
“We’ve created a system that favors false information not because we wanted to, but because false information makes more money.”
If you’re not outraged by now, you need to get your outrage organ checked. But I digress. This problem now affects every aspect of our society, and it is only getting worse. And, as noted above, the worst part is that it’s all driven by the greed of Silicon Valley corporations, their perceived need to “monetize,” i.e. make bundles of money. Tristan Harris puts it this way: “It’s a disinformation-for-profit business model.” Disinformation for profit. And this is leading us, all unknowing, to a situation where there is no longer a consensus based on agreed-upon truth. There is no longer even agreement about scientifically-validated truth. Everything is up for grabs. Every fact is in dispute. Which we see right now in the Trump-disputed election—not based on any evidence or facts, but on shared conspiracies reinforced by social media.
Is this serious? Tristan Harris again: “We in the tech industry have created the tools to destabilize and erode the fabric of society, and in every country of the world.” Jaron Lanier adds: “If we keep on doing this, we probably destroy our civilization.”
And one can’t help wondering, why do we keep on doing this? Why isn’t something being done? Why isn’t this an urgent topic for regulators, for the U.S. Congress? In fact, Harris and others have testified about these issues before Congress not once, but twice: once before the Senate on June 25, 2019, and once before the House Subcommittee on Consumer Protection and Commerce on January 8, 2020. The latter hearing was entitled “Americans at Risk: Manipulation and Deception in the Digital Age.” And what has been the result? As far as I can tell, nothing. Nada. Zip. Which points to the essential problem: the fundamental business model. That model, which Harris characterizes as “disinformation for profit,” says that it’s ok to fleece and manipulate and deceive consumers, to lead young girls into depression and suicide, to undermine the very fabric of democratic society and of humanity itself, of truth itself, as long as it’s in the interest of making money. In short, where so many trillions of dollars are concerned, virtually anything is permitted. And those dollars virtually guarantee that nothing will be done by lawmakers.
Unless, that is, the public, millions of concerned citizens and parents of the generations now having their brains and lives hacked, rise up with indignation and outrage over what is being done to them and theirs. Then, and only then, will this Frankenstein’s monster we’ve created be regulated, and controlled, and made to serve not profit, but human beings.
One can only hope that that day comes in time. But based on what has happened so far, there is little cause for optimism—unless, again, enough people watch the documentary (available on Netflix), or join the Center for Humane Technology (www.humanetech.com), and, having absorbed its message, take to the phones, the airwaves, the streets with their anger and outrage on full display. If that were to happen, the whole pernicious system could be forced to change.
But there isn’t much time. The freedom of every mind, of whole generations to be able to make their own decisions, hangs in the balance.
Lawrence DiStasi