Why surveillance capitalists resent human freedom and autonomy.
The recent Netflix film The Social Dilemma plays as a stunning confessional. Former executives of Google, Facebook, Twitter, and other companies each air their guilty consciences on camera, issuing a warning: the social media platforms they helped create pose a grave danger to society and to individual wellbeing. As ex-Facebook executive and venture capitalist Chamath Palihapitiya put it, he helped invent “tools that are ripping apart the social fabric of how society works”. Facebook co-founder Sean Parker, likewise, claims that Facebook is designed to exploit human vulnerability. How can we, in good faith, still maintain the delusion that this club of big-tech defectors has it wrong, and that those still espousing the infantile ideals of tech utopia have it right? In fact, The Social Dilemma may understate the breadth and pervasiveness of the problem. The concentration of power in these companies alone is concerning. Zuckerberg holds a controlling share of Facebook and so personally directs the information flows of the biggest communications network in the world. This is unprecedented in human history, and that power is already being exercised in alarming ways. Surveillance capitalism, the modus operandi of Google, Facebook, Microsoft, Amazon, and others, is a novel attack on human freedom and individual autonomy.
Before we explore their mode of operation, what about the deleterious effects of the medium of the internet itself? New media technologies are not neutral amplifiers; different mediums influence our minds in very different ways. The internet affects the way we read, think, and remember, and these are skills we should cultivate if we hope to pursue justice and understanding, and to meet the challenges of our time. Research by Ziming Liu from San Jose State University found that the internet promotes browsing, scanning, skimming, and non-linear reading. According to Liu, “the digital environment tends to encourage people to explore many topics extensively, but at a more superficial level”, and “hyperlinks distract people from reading and thinking deeply.” Reading in this way, we risk neglecting, and (by virtue of our neuroplasticity) effectively decaying, our capacities to understand subjects in depth and to critically integrate new information. Simply having your phone in your field of vision has been shown to decrease cognitive and mnemonic performance.

Tech moguls proudly declare this apparent “outsourcing” of human memory a progressive triumph, one which increases access to information whilst unburdening our biological memory. In their world the mind is the brain, and the brain is a supercomputer. In this sense, as internet critic Nicholas Carr says, they are “being misled by a metaphor”. The brain is not a computer; it is an analogue, organic system. Human memories are alive and fluid, whereas computer ‘memories’ (a term that really stretches the metaphor) are dead, static data. Human memories are not digital facts; they are made up of complex, non-binary schemas. Evoking our biological memories is deeply entangled with how we understand ideas and how they relate to our lived experience; it's entangled with how we think. And of course, unlike our digital ‘brains’, we have both short-term and long-term memories.
Consolidating short-term memories into the long-term bank takes time and reflection. So it is our long-term memory, and the skills associated with feeding it, which are left sclerotic by our internet use. Our devices and platforms are designed to hold our attention with constant waves of fresh information; they are deliberately engineered to be as addictive as possible, because the competition for our attention is now a lucrative market. Our attention is sought by a plethora of apps, sites, and platforms all at once, every day. The net result is that we are near-constantly distracted, which makes deep reading, thinking, synthesis, and creative work harder. Plato warned about the effect the technology of writing would have on human thought, yet writing came to supplement human memory; the internet, by contrast, is what Carr describes as “a technology of forgetfulness”. The digital environment overloads our short-term working memory. Its information flows exceed our attentional capacity and deny us the precious respite we need to consolidate long-term memories and cultivate personal understanding.
Today, our digital environment goes beyond the implicit impacts of the medium of the internet itself. And not just in the sense that the speed of the media cycle encourages reactivity, tribalism, and cultural warfare (though it does), but in that surveillance capitalists actively seek to suppress deep thinking, reflection, free choice, and individual autonomy, precisely because all of these things threaten their financial imperatives. These imperatives, and the market outcomes surveillance capitalists seek, are stunningly elucidated in Harvard professor Shoshana Zuboff's book, The Age of Surveillance Capitalism.
Although the motto “If you’re not paying, you are the product” prompts a useful shift in perspective, it is not quite right: the real product is prediction. From our behaviour, emotions, experiences, personalities, habits, and heartbreaks, ‘prediction products’ are derived, and these are incredibly attractive to third-party customers seeking highly determined demand. First it was targeted advertising. In 2000, amid the bursting ‘dotcom’ bubble and mounting pressure from shareholders, Google caved on its earlier resistance to the advertising model. It adopted the model, and within four years Google’s advertising business was accruing annual revenues in the billions of dollars. This was the birth of surveillance capitalism. Surveillance capitalists have since widened their customer base. Facial recognition technology, for example, trained on the photos we provide to surveillance platforms, is sold down a chain of third-party buyers; in one case it has assisted operations to track Uighurs imprisoned in Chinese “re-education camps”.
Those early levels of predictive certainty were just the beginning. Surveillance capitalists want to determine outcomes more accurately by modifying human behaviour en masse. Zuboff writes: “surveillance capitalists declare their right to modify others’ behaviour for profit according to methods that bypass human awareness, individual decision rights…and autonomy and self-determination.” So in addition to predicting with high accuracy what we will think, feel, and do, these companies now seek to modify, without consent, what we think, feel, and do, all in alignment with the preferred outcomes of their customers. Facebook’s own researchers boasted as early as 2012, in an article published in Nature, that they had successfully influenced “the real-world voting behaviour of millions of people” in the US midterm elections. More recently, Pokémon Go, developed by Niantic, a company spun out of Google, herded users into businesses that were paying to have more Pokémon featured in their stores. Companies were paying surveillance capitalists to direct unsuspecting gamers through their doors, providing lucrative foot traffic and shedding the uncertainty of demand. Remind me, just who are the ‘users’ in all this? To achieve such seamless efficiency, a great deal of our individual freedom, consent, and privacy is sacrificed. Without uncertainty, how can there be freedom? With so much of our society built around these platforms, and considering their foreclosure of alternative mediums (their monopolisation), are we really consenting? When the most precious and personal parts of ourselves are claimed as behavioural surplus, what of our privacy? What of our inner lives?