
The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power
by Shoshana Zuboff
30 popular highlights from this book
Key Insights & Memorable Quotes
Below are the most popular and impactful highlights and quotes from The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power:
The real psychological truth is this: If you’ve got nothing to hide, you are nothing.
Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data.
it is no longer enough to automate information flows about us; the goal now is to automate us.
Industrial capitalism transformed nature’s raw materials into commodities, and surveillance capitalism lays its claims to the stuff of human nature for a new commodity invention. Now it is human nature that is scraped, torn, and taken for another century’s market project. It is obscene to suppose that this harm can be reduced to the obvious fact that users receive no fee for the raw material they supply. That critique is a feat of misdirection that would use a pricing mechanism to institutionalize and therefore legitimate the extraction of human behavior for manufacturing and sale. It ignores the key point that the essence of the exploitation here is the rendering of our lives as behavioral data for the sake of others’ improved control of us. The remarkable questions here concern the facts that our lives are rendered as behavioral data in the first place; that ignorance is a condition of this ubiquitous rendition; that decision rights vanish before one even knows that there is a decision to make; that there are consequences to this diminishment of rights that we can neither see nor foretell; that there is no exit, no voice, and no loyalty, only helplessness, resignation, and psychic numbing; and that encryption is the only positive action left to discuss when we sit around the dinner table and casually ponder how to hide from the forces that hide from us.
What would hold society together in the absence of the rules and rituals of clan and kin? Durkheim’s answer was the division of labor.
Two men at Google who do not enjoy the legitimacy of the vote, democratic oversight, or the demands of shareholder governance exercise control over the organization and presentation of the world’s information. One man at Facebook who does not enjoy the legitimacy of the vote, democratic oversight, or the demands of shareholder governance exercises control over an increasingly universal means of social connection along with the information concealed in its networks.
Imagine you have a hammer. That’s machine learning. It helped you climb a grueling mountain to reach the summit. That’s machine learning’s dominance of online data. On the mountaintop you find a vast pile of nails, cheaper than anything previously imaginable. That’s the new smart sensor tech. An unbroken vista of virgin board stretches before you as far as you can see. That’s the whole dumb world. Then you learn that any time you plant a nail in a board with your machine learning hammer, you can extract value from that formerly dumb plank. That’s data monetization. What do you do? You start hammering like crazy and you never stop, unless somebody makes you stop. But there is nobody up here to make us stop. This is why the “internet of everything” is inevitable.
This is the existential contradiction of the second modernity that defines our conditions of existence: we want to exercise control over our own lives, but everywhere that control is thwarted. Individualization has sent each one of us on the prowl for the resources we need to ensure effective life, but at each turn we are forced to do battle with an economics and politics from whose vantage point we are but ciphers. We live in the knowledge that our lives have unique value, but we are treated as invisible. As the rewards of late-stage financial capitalism slip beyond our grasp, we are left to contemplate the future in a bewilderment that erupts into violence with increasing frequency. Our expectations of psychological self-determination are the grounds upon which our dreams unfold, so the losses we experience in the slow burn of rising inequality, exclusion, pervasive competition, and degrading stratification are not only economic. They slice us to the quick in dismay and bitterness because we know ourselves to be worthy of individual dignity and the right to a life on our own terms.
Uncertainty is not chaos but rather the necessary habitat of the present tense. We choose the fallibility of shared promises and problem solving over the certain tyranny imposed by a dominant power or plan because this is the price we pay for freedom to will, which founds our right to the future tense. In the absence of this freedom, the future collapses into an infinite present of mere behavior, in which there can be no subjects and no projects: only objects.
We are no longer the subjects of value realization. Nor are we, as some have insisted, the “product” of Google’s sales. Instead, we are the objects from which raw materials are extracted and expropriated for Google’s prediction factories. Predictions about our behavior are Google’s products, and they are sold to its actual customers but not to us. We are the means to others’ ends.
Google is a shape-shifter, but each shape harbors the same aim: to hunt and capture raw material. Baby, won’t you ride my car? Talk to my phone? Wear my shirt? Use my map? In all these cases the varied torrent of creative shapes is the sideshow to the main event: the continuous expansion of the extraction architecture to acquire raw material at scale to feed an expensive production process that makes prediction products that attract and retain more customers.
Mass production was aimed at new sources of demand in the early twentieth century’s first mass consumers. Ford was clear on this point: “Mass production begins in the perception of a public need.” Supply and demand were linked effects of the new “conditions of existence” that defined the lives of my great-grandparents Sophie and Max and other travelers in the first modernity. Ford’s invention deepened the reciprocities between capitalism and these populations. In contrast, Google’s inventions destroyed the reciprocities of its original social contract with users. The role of the behavioral value reinvestment cycle that had once aligned Google with its users changed dramatically. Instead of deepening the unity of supply and demand with its populations, Google chose to reinvent its business around the burgeoning demand of advertisers eager to squeeze and scrape online behavior by any available means in the competition for market advantage. In the new operation, users were no longer ends in themselves but rather became the means to others’ ends. Reinvestment in user services became the method for attracting behavioral surplus, and users became the unwitting suppliers of raw material for a larger cycle of revenue generation.
Right now, however, the extreme asymmetries of knowledge and power that have accrued to surveillance capitalism abrogate these elemental rights as our lives are unilaterally rendered as data, expropriated, and repurposed in new forms of social control, all of it in the service of others’ interests and in the absence of our awareness or means of combat. We have yet to invent the politics and new forms of collaborative action—this century’s equivalent of the social movements of the late nineteenth and twentieth centuries that aimed to tether raw capitalism to society—that effectively assert the people’s right to a human future. And while the work of these inventions awaits us, this mobilization and the resistance it engenders will define a key battleground upon which the fight for a human future unfolds.
Research by media scholars Daniel Kreiss and Philip Howard indicates that the 2008 Obama campaign compiled significant data on more than 250 million Americans, including “a vast array of online behavioral and relational data collected from use of the campaign’s web site and third-party social media sites such as Facebook.…” Journalist Sasha Issenberg, who documented these developments in his book The Victory Lab, quotes one of Obama’s 2008 political consultants who likened predictive modeling to the tools of a fortune-teller: “We knew who… people were going to vote for before they decided.”
Google’s ideal society is a population of distant users, not a citizenry. It idealizes people who are informed, but only in the ways that the corporation chooses. It means for us to be docile, harmonious, and, above all, grateful.
First, there is hardly an innocent app; if it’s not tracking you now, it may be doing so in the next week or month: “There is an entire industry based upon these trackers, and apps identified as ‘clean’ today may contain trackers that have not yet been identified. Tracker code may also be added by developers to new versions of apps in the future.” Second is that even the most innocent-seeming applications such as weather, flashlights, ride sharing, and dating apps are “infested” with dozens of tracking programs that rely on increasingly bizarre, aggressive, and illegible tactics to collect massive amounts of behavioral surplus ultimately directed at ad targeting. For example, the ad tracker FidZup developed “communication between a sonic emitter and a mobile phone. . . .” It can detect the presence of mobile phones and therefore their owners by diffusing a tone, inaudible to the human ear, inside a building: “Users installing ‘Bottin Gourmand,’ a guide to restaurants and hotels in France, would thus have their physical location tracked via retail outlet speakers as they move around Paris.”
“Personal information is increasingly used to enforce standards of behavior. Information processing is developing, therefore, into an essential element of long-term strategies of manipulation intended to mold and adjust individual conduct.”
the founder embraced the button only when new data revealed it as a powerful source of behavioral surplus that helped to ratchet up the magnetism of the Facebook News Feed, as measured by the volume of comments.
Facebook’s own North American marketing director, Michelle Klein, who told an audience in 2016 that while the average adult checks his or her phone 30 times a day, the average millennial, she enthusiastically reported, checks more than 157 times daily. Generation Z, we now know, exceeds this pace. Klein described Facebook’s engineering feat: “a sensory experience of communication that helps us connect to others, without having to look away,” noting with satisfaction that this condition is a boon to marketers. She underscored the design characteristics that produce this mesmerizing effect: design is narrative, engrossing, immediate, expressive, immersive, adaptive, and dynamic. If you are over the age of thirty, you know that Klein is not describing your adolescence, or that of your parents, and certainly not that of your grandparents. Adolescence and emerging adulthood in the hive are a human first, meticulously crafted by the science of behavioral engineering; institutionalized in the vast and complex architectures of computer-mediated means of behavior modification; overseen by Big Other; directed toward economies of scale, scope, and action in the capture of behavioral surplus; and funded by the surveillance capital that accrues from unprecedented concentrations of knowledge and power. Our children endeavor to come of age in a hive that is owned and operated by the applied utopianists of surveillance capitalism and is continuously monitored and shaped by the gathering force of instrumentarian power. Is this the life that we want for the most open, pliable, eager, self-conscious, and promising members of our society?
Futuristic as this may sound, the vision of individuals and groups as so many objects to be continuously tracked, wholly known, and shunted this way or that for some purpose of which they are unaware has a history. It was coaxed to life nearly sixty years ago under the warm equatorial sun of the Galapagos Islands, when a giant tortoise stirred from her torpor to swallow a succulent chunk of cactus into which a dedicated scientist had wedged a small machine. It was a time when scientists reckoned with the obstinacy of free-roaming animals and concluded that surveillance was the necessary price of knowledge. Locking these creatures in a zoo would only eliminate the very behavior that scientists wanted to study, but how were they to be surveilled? The solutions once concocted by scholars of elk herds, sea turtles, and geese have been refurbished by surveillance capitalists and presented as an inevitable feature of twenty-first-century life on Earth. All that has changed is that now we are the animals.
There are many buzzwords that gloss over these operations and their economic origins: “ambient computing,” “ubiquitous computing,” and the “internet of things” are but a few examples. For now I will refer to this whole complex more generally as the “apparatus.” Although the labels differ, they share a consistent vision: the everywhere, always-on instrumentation, datafication, connection, communication, and computation of all things, animate and inanimate, and all processes—natural, human, physiological, chemical, machine, administrative, vehicular, financial. Real-world activity is continuously rendered from phones, cars, streets, homes, shops, bodies, trees, buildings, airports, and cities back to the digital realm, where it finds new life as data ready for transformation into predictions, all of it filling the ever-expanding pages of the shadow text.
Google had discovered a way to translate its nonmarket interactions with users into surplus raw material for the fabrication of products aimed at genuine market transactions with its real customers: advertisers. The translation of behavioral surplus from outside to inside the market finally enabled Google to convert investment into revenue. The corporation thus created out of thin air and at zero marginal cost an asset class of vital raw materials derived from users’ nonmarket online behavior.
Google had unilaterally undertaken to change the rules of the information life cycle when it decided to crawl, index, and make accessible personal details across the world wide web without asking anyone’s permission.
It was there, in Spain, that the right to the future tense was on the move, insisting that the operations of surveillance capitalism and its digital architecture are not, never were, and never would be inevitable. Instead, the opposition asserted that even Google’s capitalism was made by humans to be unmade and remade by democratic processes, not commercial decree. Google’s was not to be the last word on the human or the digital future.
Birds, bees, butterflies… nests, holes, trees, lakes, hives, hills, shores, and hollows… nearly every creature shares some version of this deep attachment to a place in which life has been known to flourish, the kind of place we call home. It is in the nature of human attachment that every journey and expulsion sets into motion the search for home. That nostos, finding home, is among our most profound needs is evident by the price we are willing to pay for it.
users are not products, but rather we are the sources of raw-material supply. As we shall see, surveillance capitalism’s unusual products manage to be derived from our behavior while remaining indifferent to our behavior. Its products are about predicting us, without actually caring what we do or what is done to us.
One consequence of the new density of social comparison triggers and their negative feedback loops is a psychological condition known as FOMO (“fear of missing out”). It is a form of social anxiety defined as “the uneasy and sometimes all-consuming feeling that . . . your peers are doing, in the know about, or in possession of more or something better than you.” It’s a young person’s affliction that is associated with negative mood and low levels of life satisfaction.
The corporation’s ability to hide this rights grab depends on language as much as it does on technical methods or corporate policies of secrecy. George Orwell once observed that euphemisms are used in politics, war, and business as instruments that “make lies sound truthful and murder respectable.”
we can recognize that over the centuries we have imagined threat in the form of state power. This left us wholly unprepared to defend ourselves from new companies with imaginative names run by young geniuses that seemed able to provide us with exactly what we yearn for at little or no cost. This new regime’s most poignant harms, now and later, have been difficult to grasp or theorize, blurred by extreme velocity and camouflaged by expensive and illegible machine operations, secretive corporate practices, masterful rhetorical misdirection, and purposeful cultural misappropriation.
This unprecedented concentration of knowledge produces an equally unprecedented concentration of power: asymmetries that must be understood as the unauthorized privatization of the division of learning in society.