Too Smart: How Digital Capitalism Is Extracting Data, Controlling Our Lives, and Taking Over the World (Sadowski, Jathan)


Highlights


No checkout necessary; your account will be autobilled. In return for this convenience, Amazon will track your location and behavior via hundreds of small cameras throughout the store equipped with analytics software—where you browse, how long you linger, and what you pick up and put back.6


“Amazon Choice: Toss away the grocery list. We already know what you want!”


At a public talk in early 2017, Andrew Ng, an artificial intelligence researcher who has held top positions at Google, Baidu, and Coursera, was explicit about the primacy of data collection: “At large companies, sometimes we launch products not for the revenue, but for the data. We actually do that quite often . . . and we monetize the data through a different product.”


This insatiable hunger for data arises from the fact that data is now a form of capital, just like money and machinery.


first argued in an article for the journal Big Data and Society, the collection and circulation of data as capital is now a core feature of contemporary capitalism.


  1. Data is used to profile and target people.


  2. Data is used to optimize systems.


  3. Data is used to manage things.


Data is used to grow the value of assets. Things like buildings, infrastructure, vehicles, and machinery are depreciating assets. They lose value over time as the forces of entropy—or wear and tear—take their toll. Yet upgrading assets with smart tech that collects data about their use helps combat the normal cycle of deterioration. They become, as financier Stuart Kirk states, “more adaptive and responsive, thereby extending their useful lives.”23 Rather than depreciating, smartified assets can maintain and gain value. Or if they don’t grow value, at least data can slow its decay.


According to the champions of digital capitalism, one of the major impediments to its development is that many companies do not yet “fully appreciate” that “their data is their single biggest asset.”


This isn’t because they gather only small amounts of data—only what’s needed, and nothing more—but rather because the amount harvested can always be bigger.


The symbol of disciplinary power is the panopticon: a structure built in such a way that you know you could always be watched, but you never know when or if you are being watched, so you always behave as if you are. Discipline trades on equal parts paranoia, guilt, and shame.


The health wristband records your vital signs. We now have a thousand identities, each represented by a thousand data points collected by a thousand devices. “The human body,” as sociologists Kevin Haggerty and Richard Ericson explain, “is broken down by being abstracted from its territorial setting. It is then reassembled in different settings through a series of data flows. The result is a decorporealized body, a ‘data double’ of pure virtuality.”


Living in a smart society means always being divided, being further dividualized.


We’re not dealing with a technopolitical system that is outside capitalism or an aberration of it. In many ways, it’s just variations of the same old capitalism, but now running on some new hardware and software. And that brings me to the first thesis.


The operations of capital are adapting to the digital age, while also still maintaining the same essential features of exclusion, extraction, and exploitation.5


An essential reading list includes, but is not limited to, Dark Matters (2015), Digital Sociologies (2016), The Intersectional Internet (2016), Programmed Inequality (2017), Algorithms of Oppression (2018), and Automating Inequality (2018).7


They are further spread by a culture that doesn’t recognize how it reproduces inequity and exclusion, and doesn’t have enough incentive to do things differently because the people who benefit from the status quo are the same people who make decisions about what to build and how to use it.


Yet, computing in the service of powerful interests, be they state or corporate, tends to inculcate stereotypes and static identities appropriate to reifying and perpetuating forms of existing power. The purpose of these systems is to discipline information in the service of a particular goal. In order to increase their own efficiency and power, such systems must stylize reality and translate it into an informational landscape where it can be acted upon in a seemingly frictionless, disinterested, and unbiased way. In point of fact, however, this process of rendering information computable relies on institutionalizing the views and biases of those constructing the system, and reflexively serves their ends.


Thus the rise of the digital age is not a disruptive break from history. It is a new way of repackaging, reproducing, and revitalizing what came before. We have to look beyond the high-tech veneer that covers up the machinations of old power regimes.9 The more things change, the more they stay the same.


Common practices of data collection should be seen as theft and/or exploitation.


Much of the valuable data harvested from the world is about people: our identities, beliefs, behaviors, and other personal information. This means collecting data often goes hand in hand with increasingly invasive systems for probing, tracking, and analyzing people.


Following in the footsteps of other extractive enterprises through capitalism’s history, such as land grabs and resource mining, data is taken with little regard for meaningful consent and fair compensation for the people that data is about.


EULAs are known as “boilerplate” contracts because they are generically applied to all users. They are one-sided, nonnegotiated, and nonnegotiable. They are long, dense legal documents, designed not to be read. You either unquestioningly agree or you are denied access. “It is hard, therefore, to consider them to be free and voluntary arrangements since one party has no power to enact their demands,” explains political economist Kean Birch.19 These companies don’t seek consent; they demand compliance. This is far from the standard of informed, active consent that actually preserves our choice and autonomy.


Compensation most often comes in the form of access to services like Facebook’s platform and Google’s search engine. Rather than charging money to use the service, the owner collects data as payment.


What he found is that people were not indifferent to privacy concerns but instead frequently expressed feelings of being “powerless” to do anything about data collection. As one participant said, “My biggest thing from loss of privacy isn’t about other people knowing information about you but kind of being forced or bribed to share your information.”


That sense of powerlessness is intensified by a lack of knowledge about when, how, or why data collection happens, and what the consequences might even be. Another participant noted, “We really don’t know where things collected about us go—we don’t understand how they interact in such a complex environment.”23


No matter how high tech their methods and how polished their public image, the enterprises at the heart of digital capitalism are extractive. They recognize the value of data, and take advantage of political and legal systems that protect their rights to property and profit over our rights to consent and compensation. We should be eliminating corporate thievery, not tolerating yet another type.


All abstractions are only representations or simulations of a thing, not the thing itself.


Datafication is, like all ways of scrutinizing and sorting the world, also a way of exercising power over the world.26


The language that critical scholars and practitioners use to discuss datafication—data grabbing, data colonialism, data extraction—is a lexicon of violence. Instead of dancing around this relationship, we ought to be clear and explicit about the violence involved in abstracting human life, identity, and action. Not all violence is equally severe, but it is nonetheless still violence.


Our understanding of violence tends to be so parochial, with its focus on direct physical force, that it is almost hard to imagine violence on the scale that can be—and indeed, has been—carried out through datafication. Datafication is abstraction, but its results are not always abstract.


Platforms are the new landlords of digital capitalism.


The solutionist ideology works backward: those in the business of selling solutions need solvable problems, and thus every problem is framed in a way that justifies the solutions that are readily available, fit a certain frame of the world, and/or most benefit the seller. Invention becomes the mother of necessity.


For technocrats, all human values can be ignored, downplayed, or recast as technical parameters. Any trade-offs and assumptions are hidden inside simplistic cost-benefit analysis. There are no valid disagreements to be had, only obvious decisions to be made. The question is never “why” do something, but “how” to do it. They do things not because they should but instead because they can.


Not surprisingly, the modern technocrat tends to focus on problems and solutions that concern their idiosyncratic desires and dislikes, or improve their net worth and influence, all while making claims about “saving the world.”


We are promised a futuristic society that is engineered to run like a hyperefficient machine. All we have to do is hand over control to tech corporations and trust in the benevolence of capitalism with Silicon Valley characteristics.


Like other political projects, building the smart society is a battle for our imagination.


Telling stories about the future is a way of shaping the present.


While tech companies claim to be fiercely innovative and disruptive, driven by a desire to change the world, they never actually sketch a range of alternative futures. They eschew truly radical visions that might challenge the status quo or their position. Instead, they offer a curated selection of solutions and scenarios with the aim of establishing their version of a smart society as the future—the only one available or possible.


“It is easier to imagine the end of the world than to imagine the end of capitalism.”


“A structure where one man gets to decide whether hundreds of thousands of people will be able to feed their children or pay rent is intolerable,”


Capitalism—in any variety, digital or otherwise—is a system that is both intolerable and enduring, but not inevitable.


The fight for our imagination—and the world it gives shape to—is not yet over.


No mindfulness app, which reminds you to take a few moments to sit quietly every day, is going to steel you against a political economic system designed to concentrate power in the hands of a few.


These things might help us tolerate the worst excesses of digital capitalism—to be sure, every little bit of relief helps—but they aren’t going to change the system.


The “starter interrupt device” allows auto lenders to track the location of cars, both in real time and over time, and remotely shut off vehicles if the borrower falls behind on payments (sometimes by only a day) or drives outside an approved area. There’s no escaping debt collectors who can, with the push of a button on their smartphones, disable your car until you cough up payment. As one lender said, “I have disabled a car while I was shopping at Walmart.”4 No effort, no stealth, no confrontation required.


For those in the West, it is easy to claim that Zhima Credit and the SCS are products of an authoritarian regime.


Modern Times ingeniously skewered work and life in industrial society. Even today it remains a masterful film, but what if we made a sequel called Contemporary Times about work in a smart society? What would it look like? Instead of a factory in the 1930s, perhaps it would be set in the early twenty-first century’s version of a brutally exploitative workplace: an Amazon warehouse.


“All you people care about is the rates, not the well-being of the people. I’ve never worked for an employer that had paramedics waiting outside for people to drop because of the extreme heat.”55


While smart tech allows us to quantify and access information about ourselves, that data is not then locked away in our own private vaults. The smart self exists in databases owned by others.


“This thing is addressing problems that don’t exist.”


Finding ways to squeeze more value from workers is a powerful motivation for innovation. In response, workers have always searched for ways to slow down the pace of work, reclaim some of the value they produce, and exercise their human agency.


“Within the same room, machines were smashed or spared according to the business practices of their owners.”


Luddism was motivated by factory owners using machines to drastically increase productivity targets, accelerate the pace of work, and squeeze more value from workers. Sound familiar? “Luddism wasn’t a war on machines,” writes Byrne.10 It was a working-class movement, which understood the importance of confronting the technopolitics of industrial capitalism.


Luddism as policy.12


Ultimately unmaking means thinking bigger than just downgrading our smart toasters or detoxing from our smartphones. It is a method of reckoning with the “material foundations” and “form of society” created by digital capitalism.


Those that don’t actually enhance life and benefit society—which instantly excludes nearly all smart tech designed to expand data extraction and social control—should be unmade.


We’ve been taught to think about innovation as a result of unexpected ideas and great men, which then produce technoscientific advances that improve human life.


When I say we need to take control of innovation, I mean we need to democratize innovation by giving more people more power to influence how, why, and for what purpose new technology is created.


The problem is that the current model socializes risks and privatizes rewards: governments spend public money and bear the risks of investment, while businesses claim ownership and reap the benefits. Silicon Valley, for example, would basically not exist if not for the long list of technoscientific discoveries funded by government agencies, not to mention the cash injections and business deals that prop up iconic companies like SpaceX and Amazon. Indeed, some of the most exalted entrepreneurs like Elon Musk are, in many respects, glorified government contractors.


An institution like a data repository would be a promising step toward preventing private interests from controlling the ability to build and benefit from data-driven tech. For example, instead of a company like Uber—monopolistic, mercenary, and merciless—gathering reams of data about people’s mobility for its own benefit and then charging local governments for access to its insights, a repository could ensure the data is used to enhance and expand public transportation.


collectivize data,


Through public ownership, the power of data can fuel more socially beneficial innovation, its value can be more equally distributed, and its potential can be more fully unlocked.


The three tactics outlined above—deconstructing capital, democratizing innovation, and demanding data—contribute in their own ways to the goal of seizing smartness to build a better society.


Resisting, redefining, and redesigning the smart society will be difficult, but it is necessary. Thatcher’s slogan that “there is no alternative” was a declaration of triumph. Silicon Valley is already hanging the “mission accomplished” banner. It’s our job to show that its celebration is premature.

Marco Herrera Solar. Last modified: July 03, 2024. Site built with Franklin.jl and the Julia programming language.