4-Part Series: The Digital Paradox: Why We Hate Our Devices But Can't Live Without Them
Part 2: Surveillance Capitalism Tightens Its Grip
I'm irritated.
I go to read the morning news with my coffee when another aggressive pop-up interrupts: "Accept all cookies or customize settings?" Ugh. I click "accept all" because I just want to read the news, not parse techno-legal word salad. But unease lingers. Asking for consent usually signals risk.
[This is Part 2 of a series exploring our paradoxical relationship with digital technology. Read Part 1 here: 'A 1960s philosopher spots the seeds of today's digital paradox']
With these cookie notices, it's not a risk of physical harm I’m consenting to, but the erosion of privacy, the feeling of constantly being monitored. On some level, I know I'm allowing websites to track my behavior and preferences, feeding data to personalized ads and systems designed to shape what I might see, click, and even believe.
But like everyone else, I click "accept" and move on. With each click of acceptance, I surrender more control and anxiety accumulates in some corner of my brain I avoid visiting too often.
The platform owners aren't just after my money - they're after my obedient capitulation. Each click of acceptance weakens my ability to resist or imagine alternatives. I become exactly what they want: a participant in my own manipulation, generating data and accepting their algorithmic suggestions as inevitable. They slowly reshape what feels possible.
Seeking escape, I open Facebook to scroll mindlessly. My sunset photo from last night has 268 likes! The dopamine hit feels good. I want more until I recognize it’s an engineered reward that leaves me hollow and disconnected.
My thoughtful post from four days ago expressing hope for a fair election sits unviewed by any of my friends who care deeply about politics. Tech billionaires curate my isolation. They control what others see and ensure I blame myself for every ignored post. Each little rejection triggers ancient survival anxiety circuits - the rational mind can't outrace the primal fear of being cast out.
Just another morning negotiating with my digital life. No wonder the U.S. Surgeon General recently warned of how social technologies paradoxically leave us more anxious and isolated even as they promise connection.
The Birth of Behavioral Surplus
What began as collecting data to improve their search engine became something more profitable: mining our pauses and query revisions for clues to inform behavioral predictions.
Shoshana Zuboff named this new reality surveillance capitalism. Where TV ads once tried to influence everyone the same way, today's platforms study and shape each person uniquely.
Google pioneered this model. What began as collecting data to improve their search engine became something more profitable: mining our pauses and query revisions for clues to inform behavioral predictions. This 'behavioral surplus' became more valuable than the search service itself. Google wasn't just helping us find information - they were building a system to predict and shape human behavior at scale.
Facebook took this model further by turning our social connections into prediction engines. The platform doesn't just know what we've done - it knows what our friends do, what people like us do, and how to shape what we'll do next. These engines do more than predict shopping. They aim to trigger emotions that maximize screen time, pulling us away from direct experience in the real world.
Today, 41% of Americans report being online "almost constantly," with that number at 62% among young adults. Teens who use social media most heavily report twice the rate of poor mental health. The average teen now spends almost five hours daily on apps like YouTube, TikTok, and Instagram.
We fret about stranger danger and monitor playground interactions, but we've gone numb to the daily invasion of our kids' minds. A stranger photographing children gets arrested. A researcher studying kids without consent loses their job. Meanwhile, tech billionaires harvest every detail of our children's digital lives, from their vulnerabilities to their conversations. These digital predators intentionally target children, knowing young brains are easier to manipulate. Campaign cash keeps legislators looking the other way.
Prediction Becomes Power
This systematic tightening of control now shapes our shared reality, not just our individual behavior.
Amazon was an early pioneer. What began as tracking purchases to predict shopping became actively shaping what you buy. Their algorithms do more than recommend. They shape preferences through artificial scarcity and manipulated pricing. Convenience masks control.
TikTok takes this beyond commerce. Its algorithms learn exactly which emotional triggers keep each user scrolling, creating addictive cycles of anxiety and relief. Despite studies showing higher rates of depression among teens who spend more time scrolling, the company keeps optimizing for emotional addiction over wellbeing. The platform's reward patterns echo abusers' favorite trick - intermittent hits of validation until you can't walk away.
This same logic now drives how Mark Zuckerberg's Meta and Sundar Pichai's Google shape our digital lives. Each feature of their platforms exists to keep you captive to their algorithms. "Free" platforms and services are bait. The real product is your captured attention and your future behavior the companies seek to mold on behalf of the highest bidder. Car companies and cancer cure scammers shop at the same store where our future behaviors are the inventory.
This systematic tightening of control now shapes our shared reality, not just our individual behavior. My election post disappeared into the void. Not by accident. These companies control what spreads and what stays buried, shaping what we know and believe. Zuboff calls this 'epistemic power' - unprecedented control over how society makes sense of the world. When the same companies that profit from predicting our behavior also control how we understand their power, they achieve total influence.
The Great Digital Divide
The real threat to democracy isn't at our borders - it's in our browsers.
Tech giants run thousands of experiments daily across billions of users. They study us while feeding us cat videos. Zuboff calls it the 'epistemic divide': They accumulate vast knowledge about our behavior while we know less and less about how they use this knowledge to shape us. Each click widens this gap. They map our minds while we stumble in the dark.
Platform owners’ algorithms also make lies travel faster than truth - MIT found false news spreads six times faster on Twitter. Facebook's whistleblower exposed how their systems profit from content that divides us. When platforms reward outrage over facts, we lose our ability to find common ground.
You might wonder why we don't simply regulate these platforms, as we did with utilities in the past. Our democratic institutions crawl while tech platforms sprint ahead, reshaping society faster than we can regulate them. When the EU tightened privacy rules, Facebook simply moved its data processing elsewhere, dodging the regulations. When legislators have questioned Google's search rankings, the company’s owner has pointed to the complexity and opacity of its algorithms, which even its own engineers sometimes struggle to explain.
Tech billionaires buy political influence outright - acquiring major news outlets like Bezos's Washington Post and funding campaigns like Musk's millions for Trump. Traditional regulatory approaches struggle against this combination of technological control and concentrated private wealth.
Presidential candidates spend endless hours arguing about the supposed threats from immigrants and transgender teens while tech billionaires quietly consolidate their control over our shared reality. The real threat to democracy isn't at our borders - it's in our browsers. And when tech platform billionaires like Musk are bankrolling Trump’s political campaign, who will constrain them if Trump wins office?
Infrastructure Becomes Control
When these companies control both our connections and our discourse, they head off protective strategies at the individual and collective levels.
'Just don't use it' sounds reasonable until you try. Imagine telling someone to opt out of electricity or telephone networks in 1970. That's where we are with digital platforms - they've become the infrastructure of modern life. Try finding work without LinkedIn or keeping up with your community without Facebook. We need these platforms for managing everything from our kids’ schooling to our taxes. These digital systems have become our environment, as invisible to us as water is to fish. The same platforms we distrust have become essential.
These platforms aren't just essential infrastructure - they're engineered to demand more and more of our attention. So attempting to protect ourselves with privacy settings and digital detoxes becomes a cruel joke. Even meditation apps, promising a safe harbor from digital manipulation for a few minutes every morning, feed data to the same attention marketplace they pretend to fight. We can't opt out without cutting ourselves off from modern life, but every time we opt in, we face systems designed to pull us deeper. And when we look beyond personal protective solutions to democratic safeguards, we find the same futility.
Private companies now control the channels through which democracy itself functions - our news, discourse, and ability to organize. Built on a backbone of publicly funded infrastructure like ARPANET and bolstered by government-funded innovations in digital communication, this massive study of human nature now serves private profit rather than the public good. When these companies control both our connections and our discourse, they head off protective strategies at the individual and collective levels. But in that very frustration lies a seed of hope.
Breaking the Spell: From Personal Shame to Collective Insight
When we understand how this system profits from the isolation and shame it is designed to produce, righteous anger is the fitting response.
Some mornings I pause before clicking "accept" on those cookie notices, and I feel something beyond the usual unease - I feel anger rising. I feel used. And when I think about how they're deliberately targeting children, that anger transforms into moral outrage.
Shame hits when screen-time reports tally the hours of life lost to endless scrolling. We promise to do better. Then we surrender to notifications again. But that shame spiral after compulsive scrolling isn't a personal failing - it's a carefully engineered product. We feel alone in our struggles precisely because isolation makes us easier to influence and control. Periodically inducing shame is part of the cycle of abuse that keeps us hooked.
When we understand how this system profits from the isolation and shame it is designed to produce, righteous anger is the fitting response. When we embrace that anger and discuss it with others, it mobilizes the energy to get unstuck and helps us imagine real alternatives.
Thanks for reading! Please like, comment and share this post.