
Foucault, data, and human intimacy in the age of surveillance capitalism

Mar. 8, 2021

What is really mine anymore?

On any given day, I’m letting little bits and pieces of my personal life become available to someone—or something—else. I don’t read the terms and conditions. I give in and press buttons that half-heartedly beg for my permission to collect data about me. 

Around the time of writing this, I decided to request a copy of my data from Instagram. Two days later, I received a 393-megabyte folder of hundreds of different files, tracking my digital footprint as far back as 2013. It didn’t take long to uncover an embarrassing argument I got into with strangers in my freshman year of high school. 

It felt strange, as I anticipated it would. It’s difficult to wrap your head around the fact that the whole of your teenage life online—every comment, post, direct message, and a host of other interactions—was discreetly tracked and archived by an inanimate data-collecting entity. 

As an overly sentimental person (sometimes to the point of embarrassment), I hold the ability to call something my own very near and dear to my heart. I’ve kept every card with at least a sentence written in it in a large shoebox in my closet. I have an incessant desire to document every small joy or moment I can call mine in writing, so as to immortalize it. And despite the “iPhone storage full” notifications I get at least weekly, I keep a stash of conversations from my freshman year of high school.

Increasingly, though, things don’t feel like they’re entirely my own. Part of this is voluntary; I find joy in sharing moments with friends, even strangers, on my social media. But the exchange cuts both ways: if I want to exist in these digital spaces that allow me to share my intimate, cherished moments, I surrender them to others, and to whatever data-collecting methods—from monitored search engine queries to behavioral biometrics—are buried deep in the terms and conditions.

The effects of diminishing privacy online go beyond data collection, though. Social media platforms and the algorithms embedded within them have not only altered my existing relationships, but created new ones. The Tinder algorithm has led me to sushi dates and even friendships. The calls for zine submissions advertised to me on Instagram have led to an abundance of creative connections. Yet the ultimate goal of these algorithms is not merely to foster bonds but to monetize the human experience through data collection—a phenomenon which has come to be known as surveillance capitalism. 

In an interview with The Harvard Gazette, Shoshana Zuboff, a professor emerita at Harvard and the author of The Age of Surveillance Capitalism, defined surveillance capitalism as “the unilateral claiming of private human experience as free raw material for translation into behavioral data.” This shift from private to public has an immense effect on our relationships with ourselves and others; our shared experiences, whether through photos, messages, hashtags, or location data, become material for data monetization. 

To be clear, data monetization isn’t just a matter of profit—its optimization often comes at the cost of user privacy. The more data that algorithms can mine from users, the more accurately they can predict our future purchasing decisions.

Another problem arises from the structure of data monetization itself: most data-collecting entities are private companies, subject to far less oversight than state surveillance. And because, as John Naughton wrote in an article for The Guardian, there is a “logic of accumulation implicit in surveillance capitalism,” it certainly won’t be an easy fix.

At the same time, dataveillance can also impact how we choose to interact with one another online. Broadcasting aspects of my personal life means much of what I formerly considered private is no longer knowledge limited to my close circle, journals, and the occasional word of mouth—it’s available to whoever lurks my tagged photos, followed hashtags, and Spotify playlists. 

With this pervading, almost voyeuristic digital surveillance comes the blurring of the line between private and public life. As my personal life has become more public over the years, I’ve been left to wonder whether this growing inability to distinguish what is and isn’t private to me is of benefit at all. 

Digital or not, surveillance can be studied through the framework of the panopticon, a concept conceived by the utilitarian philosopher Jeremy Bentham. Architecturally, the panopticon functioned as a blueprint for a prison: an observation tower with tinted glass would stand at the center, surrounded by cells, so that the guard could keep an eye on all inmates at all times. The prisoners would never know whether or not they were being watched.

In Discipline and Punish, Michel Foucault extended the concept of the panopticon to the broader context of social discipline. One of its main pillars is efficiency: by streamlining many forms of surveillance into a single mechanism, power can be exercised seamlessly and at low cost.

Foucault didn’t live to witness the birth of social media and the rise of dataveillance that followed. Yet this “seeing machine”—the panopticon—bears disturbing similarities to the platforms we use today. At its core, it maximizes the intensity of surveillance power while doing the least amount of work possible. Social media platforms function, then, as the massive, centralized entity through which surveillance power is wielded.

In my romantic relationships, digital surveillance has manifested in a number of ways. I am reminded again of the way Foucault describes surveillance power: “visible, but unverifiable.” It’s not merely algorithms and dataveillance that have shaped the way we choose to present ourselves online; our own, often mutual “cyberstalking” plays a role too. At any moment, just about anyone could be lurking my profile.

In the dating world, seemingly endless information, from music taste to political views, is available at our fingertips. While performing “background checks” has helped me catch a few red flags, I often find out more than I want to. I’d reckon most of us have fallen down the mortifying rabbit hole that leads to our romantic interest’s sister’s boyfriend’s mother (however much we hate to admit it). Other times, though, we uncover deeper remnants of their digital footprint: the blatantly misogynistic tweet they wrote in 2013 differs from the progressive, feminist persona they wear now. With more of our lives online and far less left to disclose, it seems the days of romantic mystery are dwindling.

How does this affect our ability to be vulnerable with one another, then? How do we disclose our real selves when the first impression someone has is often of our hyperreal, online selves?

In an article for CounterPunch, Louai Rahal wrote that human intimacy declines as mass surveillance increases. More specifically, “when nothing about us is private, when everything is known to everyone around us, there will be nothing left to disclose and intimate self-disclosure will become impossible.”

There’s also a “relationship” (if you can even call it one) between me and another entity that I haven’t discussed. It knows when I’m up at 3 AM playing design games on my phone. It recommends grad schools in the UK between Instagram Stories. It recommends zines I should submit to when I’m inspired, meal-kit delivery services when I’m lazy, and online therapy platforms when I’ve browsed one too many anxiety blog posts. 

The relationship between me and algorithms—one that I tacitly consent to, but largely forget about—is perhaps one of the most intimate of all; the more it discovers about my online habits, the more personalized its suggestions become. The location data of my weekday coffee run becomes an opportunity to target me with home-brewing kits on Amazon. My Spotify playlists increasingly reflect the recommendations of algorithms, rather than the music my friends suggest. All of it becomes a calculus, predicting how my personal data can be translated into future, profitable action. 

The invisible, pervasive structures of surveillance capitalism seep into every corner of our lives online. I wish I could offer up a slice of hope or advocate for some marginal solution to the conditions these structures create. But one of the most terrifying aspects of the panopticon model is that it’s designed to reproduce power relations, making it an inescapable part of social reality. Our dependency on these systems and technologies is no accident; the closer they get to us, the easier it is to monetize the human experience. The more we share, the less becomes ours. 

Illustration by Eutalia De la Paz