Chapter 4 – The Moat Around the Castle
Human natural resources
Zuboff goes back to economic philosophy, more specifically to Polanyi, Marx, and Arendt, to look at “commodity fictions”: when human life became “labor”, when nature became “land”, and when exchange became “money.” Polanyi dubbed this the “great transformation” to a self-regulating market economy, and Arendt elaborated it further as an ongoing cycle bound to repeat. Zuboff then extends that logic to a fourth fiction: surveillance capitalism arose when human experience became “behavioral data.” This whole argument is surprising, since it goes against a major point of the previous chapter, which stated that surveillance capitalism was not a natural evolution of capitalism but had to be willed out of thin air by individual entrepreneurs at a particular moment in time…
Fast forward to 2018: the capture of human experience by capitalists to feed a surveillance system has been broadly accepted, and seemingly little challenged by institutions. But Zuboff argues that this was not by chance; rather, Google and co. used a systematic method to obscure their actions and get buy-in from the powers that be. Through specific corporate governance models, post-9/11 American politics, and the proactive construction of fortifications, they have so far been able to fend off all attacks. This chapter deals with the specific defenses that were built once the model was established.
The cry freedom strategy
The corporate structure put together at IPO time enabled Google’s founders to retain majority control over the company’s strategic decisions, by giving them shares with 10x the voting rights of ordinary shares. Then in 2015, a third class of shares was created, with zero voting rights, preventing further dilution of the founders’ power. This model proved successful, with Facebook and many other tech firms following suit (15% of all American IPOs adopted this structure). Freed from shareholder pressure, founders then used this model to pursue an aggressive strategy of buying services/products that could supply large amounts of behavioral data even without turning a profit (YouTube for Google, WhatsApp for Facebook, etc.).
At the same time, tech founders also believed that they were operating in a space free of government intervention. Zuboff draws again on Arendt’s analysis of how capitalism created its own realities in the 19th-century “dark continents,” where the Law had no reach yet. Tech companies thus decided to operate in a cyberspace where they were “moving much faster than government could.” And when governments started to move faster, surveillance capitalists began lobbying heavily against regulation on privacy, among other things, as “such laws are an existential threat to the frictionless flow of behavioral surplus.” This clear contempt for the Law closely resembles that of the Robber Barons of the late 19th century, observes Zuboff, with similar claims of “survival of the fittest/winner takes all.”
Shelter: the Neoliberal legacy
US politics, as a reaction to fears of collectivism and dictatorship brought about by WWII and the Cold War, took a turn away from regulation in the second half of the 20th century, towards “self-regulation” of companies. Tech companies used this context to make sure regulation around privacy and data collection did not come about. Furthermore, a recent tendency of the American judicial system to read the First Amendment through a “conservative-libertarian” lens (very “laissez-faire” in that regard) has enabled corporations to further fend off government intervention in their affairs. And the last nail in the coffin was Section 230, passed in 1996, which allows online companies to be treated as intermediaries rather than publishers, originally as a way to encourage them to moderate content. But Zuboff states that surveillance capitalism’s logic makes moderation undesirable, since it would mean destroying (or preventing the creation of) precious behavioral surplus…
Shelter: Surveillance Exceptionalism
Zuboff highlights a 2000 FTC report that recommended more oversight of online marketplaces, as self-regulation had not been sufficiently successful. However, the 9/11 attacks changed the narrative in the US, and privacy took a back seat to security: more data collection and cross-referencing became desirable, not frowned upon. The “war on terror” instituted a state of exception in which the intelligence community’s powers and reach were drastically increased, including its closeness to Google et al., says Zuboff. Sharing a similar need for data collection and processing, tech and intelligence walked hand in hand. For example, as early as 2002, Zuboff finds “big data”-related research projects at the Pentagon. And more tellingly, Google provided “search technology” to both the CIA and the NSA.
Zuboff then proceeds to list what could be seen as “weak signals” of the interpenetration of Google and the intelligence community: the company invested in a startup that the Pentagon had also backed, its employees attended the same conferences as some intelligence analysts, some analysts devised data-searching systems that resembled Google Search… More convincingly, she then quotes several former high-ranking intelligence leaders who at the time wished for a tighter relationship with Google. Technology companies’ data surveillance capabilities were coveted, not contested.
Fortifications
Beyond those politico-economic trends, Zuboff goes on to highlight how Google actively worked to protect its supply of behavioral surplus, through four pillars: helping political campaigns achieve some degree of behavioral prediction, lobbying, exchanging personnel with the Obama administration, and growing its influence on the academic community. The common theme here is that Google lobbies harder and reaches further than other American firms. It is hard to know, however, how that compares across industries, countries, etc. Nonetheless, the connection between parts of the administration and Google definitely exists: Google works to amplify its messages and voice in the political and academic fields, sometimes aggressively.