Chapter 8 – Rendition: From Experience to Data
Terms of Sur-render
A key step in the extraction of human behavioral data is rendition itself: the concrete operations by which lived human experience is turned into data that surveillance capitalists can use. Zuboff argues that in most cases, rendition also carries the idea that users have surrendered to the tech companies' plans to extract data from every kind of human experience: if they do not accept that their data are processed, monetized, and sold, they will usually face degraded product features. Users are supposedly warned of this "deal with the devil" by the long, convoluted privacy policies of those products… if they ever read them. Most don't.
Zuboff's examples all come from the Internet of Things, so it is tempting to argue that any consumer is free simply not to buy these products. But she notes that more and more objects are labelled "smart" precisely in order to capture one more sliver of lived experience and transform it into data. And the devices do not stop at collecting behavioral surplus: they are eventually used to display personalized advertising derived from that data.
More generally, the author makes the point that beyond discussions of opt-in, opt-out, or consent, "rendition is typically unauthorized, unilateral, gluttonous, secret and brazen." Its role in surveillance capitalism is to enable the economies of scope discussed in the previous chapter. It begins with widening the range of human experiences used as sources of behavioral data.
Body Rendition
The first Trojan horse of surveillance capitalism is the phone in your pocket: geolocation data is extremely valuable to advertisers of all kinds, who are encouraged to "map the daily patterns" of a "target audience" in order to "intercept people in their daily routines with brand and promotional messages." 90% of Americans do not turn off their location services. But even when they do, companies like Google have other ways of locating them (Wi-Fi networks, cell-tower triangulation, and so on). And once the data is collected, many anonymization techniques are not robust against re-identification through reverse engineering. Geolocation data is also useful beyond the straightforward goal of maximizing advertising revenue tied to a specific user: aggregated, it yields population-level insights and predictions.
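The book makes the anonymization point in prose, but a toy sketch can show why it holds: in a dataset of "anonymized" location traces, a handful of known place-and-time sightings is often enough to single out one person. The example below is illustrative and not from Zuboff's text; the dataset, sizes, and names are invented for the sketch, echoing the well-known finding that a few spatio-temporal points uniquely identify most people in a mobility dataset.

```python
# Toy illustration of location re-identification: "anonymized" traces
# (no names, just a row index) are matched against a few known sightings.
import random

random.seed(0)

# Fake dataset: 1,000 anonymized users, each seen at 20 random
# (cell-tower, hour-of-day) points out of 200 towers x 24 hours.
N_USERS, N_CELLS, N_HOURS, PINGS = 1_000, 200, 24, 20
traces = [
    {(random.randrange(N_CELLS), random.randrange(N_HOURS)) for _ in range(PINGS)}
    for _ in range(N_USERS)
]

def candidates(known_points):
    """Return the anonymized traces consistent with a few known sightings."""
    return [i for i, t in enumerate(traces) if known_points <= t]

# An adversary knows only a few places and times where the target was seen
# (a check-in, a credit-card slip, a photo timestamp).
target = traces[42]
for k in (1, 2, 3, 4):
    known = set(random.sample(sorted(target), k))
    print(f"{k} known point(s) -> {len(candidates(known))} matching trace(s)")
```

Running the sketch, the candidate pool typically collapses to a single trace after only a few known points, which is the sense in which "anonymized" location data offers little resistance to re-identification.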
After knowing where the user is, the next step of body rendition is knowing how the user feels and what emotions they are experiencing. This is where wearables come in, with their market set to explode in the coming years. In the health industry, for example, the original model for wearable technology involved only the patient and their healthcare providers; there is now an intermediary, a tech company that derives behavioral surplus while providing the service. In fact, far more than one intermediary: in 2016 there were over 100,000 mobile health apps on Android and iOS combined. And as of the book's publication, no one is actively regulating those apps: both the FTC and the FDA offer only voluntary "guidelines" and "best practices" for health-related apps.
Absent strong coercive measures forcing surveillance capitalists to respect users' privacy, many of them are very liberal with how they use health data, often without properly notifying users. Studies have shown that privacy policies, when they exist at all, often misrepresent how companies use user health data. The last frontier is biometrics, which have exploded in recent years. Surveillance capitalists, with Facebook in the lead, have argued that it should not be necessary to obtain people's consent before facial recognition technology is used to identify them. The extraction and rendition of humans has come full circle.