The Ghost in the Machine
There is something deeply unsettling about walking through a shopping mall and receiving an advertisement on your phone for the exact pair of shoes you glanced at three weeks ago. You never searched for them. Never mentioned them to anyone. Yet somehow, the machine knew. This creeping sensation, this feeling that invisible eyes track your every move and thought, sits at the heart of both cyberpunk fiction and the economic system that now dominates our digital lives.
Cyberpunk emerged in the early 1980s as a literary movement that imagined futures where high technology met low life, where gleaming corporate towers cast long shadows over desperate street hustlers jacking into cyberspace to survive. These weren’t the gleaming utopias of earlier science fiction. No United Federation of Planets here. Instead, writers like William Gibson, Bruce Sterling, and Pat Cadigan envisioned worlds where multinational corporations had eclipsed nation states, where personal data had become the most valuable commodity, and where the line between human and machine had blurred beyond recognition.
What’s remarkable is how accurately these authors predicted our present moment. They wrote about surveillance before the smartphone. They imagined data harvesting before Facebook existed. They warned about corporate control of information before Google became a verb.
Surveillance capitalism, a term coined by scholar Shoshana Zuboff, describes an economic system built on the extraction and exploitation of human behavioral data. It treats human experience as raw material to be mined, processed, and sold. Every click, every search, every purchase, every pause on a webpage generates data that companies harvest, analyze, and monetize. The product isn’t the service. The product is you.
The overlap between cyberpunk’s fictional warnings and surveillance capitalism’s real world practices is no coincidence. Cyberpunk authors were paying attention to the early signs of digital capitalism. They extrapolated from existing trends and their extrapolations proved frighteningly prescient.
Prophets of the Wired World
William Gibson never owned a computer when he wrote Neuromancer in 1984. He typed the manuscript on a manual typewriter, a detail that seems almost impossible given how thoroughly the novel predicted the digital age. Gibson invented the term cyberspace. He imagined global computer networks before the internet went public. He envisioned hackers penetrating corporate databases and virtual reality as a space for commerce and conflict.
More importantly for our purposes, Gibson understood that in a networked world, information would become power. His megacorporations don’t conquer through military might. They dominate through data control, through owning the networks and the information flowing through them. The Tessier-Ashpool corporation in Neuromancer doesn’t need armies. It has something better. It has total information awareness.
Gibson’s vision of corporate power proved eerily accurate. Today’s tech giants don’t maintain private militaries, but they wield influence that rivals or exceeds many nation states. Google processes over 8.5 billion searches daily. Facebook, now Meta, counts nearly three billion monthly active users. Amazon Web Services hosts roughly a third of all cloud computing. These aren’t just companies. They’re infrastructure. They’re the terrain on which modern life happens.
Philip K Dick, though he wrote before the cyberpunk movement had a name, anticipated many of its concerns. His 1969 novel Ubik imagined a world saturated with advertising that had become inescapable, personalized, and intrusive. Doors that won’t open until you pay them. Coffee machines that demand money before dispensing your morning fix. Everything monitored, everything monetized, everything tracking your usage patterns.
The advertising in Ubik doesn’t just sell products. It learns. It adapts. It predicts what you want before you know you want it. Sound familiar? Amazon’s recommendation engine, Spotify’s algorithmic playlists, TikTok’s eerily accurate For You page. Dick imagined ambient, intelligent advertising half a century before it became our daily reality.
What Dick understood, and what cyberpunk writers after him would elaborate, is that surveillance and capitalism aren’t separate forces that happen to overlap. They’re symbiotic. Each feeds the other. Capitalism requires surveillance to maximize profits. Surveillance requires capitalism to fund its infrastructure. The result is a system that watches us to sell us things, and sells us things to justify watching us.
The Architecture of Control
Cyberpunk fiction is obsessed with architecture. Not just physical architecture, though the genre’s cramped urban landscapes and towering corporate headquarters are iconic. It’s equally obsessed with information architecture, with the structures that determine how data flows, who can access it, and what power derives from that access.
In Gibson’s Sprawl trilogy, the matrix isn’t just a communication network. It’s a geography. Characters navigate it like a city, with wealthy corporate nodes standing like gleaming downtown towers while independent hackers skulk through the equivalent of back alleys. The structure of information space reflects and reinforces the structure of power.
This insight proved remarkably accurate. The internet is not, despite early utopian rhetoric, a level playing field. It’s a terrain shaped by money and power. The websites you can easily find are the ones that can afford search engine optimization. The voices you hear are the ones amplified by algorithmic systems designed to maximize engagement, which usually means maximizing outrage or addiction. The platforms that mediate your social connections are owned by a handful of corporations with their own interests and agendas.
Neal Stephenson’s Snow Crash, published in 1992, pushed this architectural thinking further. His Metaverse anticipated virtual reality social spaces nearly three decades before Mark Zuckerberg rebranded his company after the concept. But Stephenson understood something that contemporary tech boosters often miss. These spaces wouldn’t be neutral. They would be owned. They would be monetized. Access would be stratified by wealth.
In Snow Crash, rich characters appear as detailed, beautiful avatars while poor characters are rendered as grainy, low resolution figures, perpetually marked as lesser by the system itself. The virtual world doesn’t escape the inequalities of the physical world. It reproduces and amplifies them.
This is precisely what we see in contemporary digital spaces. Social media platforms aren’t neutral pipes for communication. They’re designed environments optimized for surveillance and advertising. Their architectures encourage certain behaviors and discourage others. They fragment attention, encourage performance, and harvest the resulting data for profit.
The cyberpunk insight was that technology is never neutral. It embeds the values and interests of its creators. When those creators are profit driven corporations, the technology will serve profit driven ends.
Bodies and Data
One of cyberpunk’s most disturbing themes is its exploration of the body as data, as something that can be read, recorded, copied, and controlled. Characters in cyberpunk fiction routinely modify their bodies with technological implants. They store data in their heads, literally. They sell their memories, their skills, their sensory experiences as commodities.
Pat Cadigan’s Synners, published in 1991, examines what happens when the line between mind and network dissolves entirely. Characters can jack directly into the network, experiencing data as sensation and sensation as data. But this intimacy comes with vulnerability. A virus in the network becomes a virus in your brain. Corporate ownership of the network becomes corporate ownership of your thoughts.
This might have seemed fantastical in 1991. It seems less so now. Wearable fitness trackers monitor your heart rate, sleep patterns, and movement throughout the day. Smart home devices listen constantly for trigger words, recording and transmitting audio to corporate servers. Facial recognition systems identify you in public spaces. Predictive algorithms analyze your behavior patterns to anticipate your future actions.
We haven’t achieved the direct neural interface of cyberpunk fiction. Not yet, anyway, though Elon Musk’s Neuralink and similar projects are working on it. But we’ve achieved something arguably more insidious. We’ve created systems that read our bodies and minds through our behavior, through the data traces we constantly emit.
Every search query reveals something about your thoughts. Every purchase reveals something about your desires. Every pause on social media, every message, every location check-in generates data points that, aggregated and analyzed, create a detailed portrait of who you are. Or rather, who the algorithm predicts you to be.
Surveillance capitalism doesn’t need to read your mind directly. It can infer your mental states from your behavior with remarkable accuracy. It knows when you’re pregnant before you’ve told anyone. It knows when you’re depressed, when you’re lonely, when you’re vulnerable to certain kinds of marketing.
The body as data. The self as information to be harvested. Cyberpunk saw this coming.
Resistance and Its Limits
Cyberpunk fiction is often criticized for political incoherence. Its protagonists are usually rebels, hackers and street samurai fighting against corporate control. But the genre rarely offers a coherent vision of what an alternative system might look like. The best the protagonists can typically achieve is survival, carving out small spaces of autonomy within a system they cannot fundamentally change.
This might seem like a failure of imagination. But it might also be realism. One of surveillance capitalism’s most effective tools is the difficulty of imagining alternatives. The system presents itself as natural, inevitable, the way things must be. You can’t opt out of digital life without accepting severe penalties. Try getting a job, maintaining friendships, or navigating a modern city without a smartphone. The infrastructure of surveillance is now the infrastructure of daily life.
Gibson’s hackers in Neuromancer don’t overthrow the corporations. They conduct heists, redistributing information and wealth in small quantities while the fundamental power structures remain unchanged. This feels depressingly accurate to our present moment. Individual acts of digital resistance, using VPNs, installing ad blockers, deleting Facebook, might protect you personally but do nothing to change the system.
Bruce Sterling’s Islands in the Net, published in 1988, examines this dynamic with more nuance than most cyberpunk. His protagonist works for a progressive corporation trying to do business ethically in a cutthroat world. But the novel shows how even well intentioned actors get coopted by systemic pressures. Individual virtue is not enough when the rules of the game are rigged.
Recent speculative fiction has tried to imagine more systemic resistance. Cory Doctorow’s novels, like Little Brother and its sequels, depict organized movements using technology against surveillance. His characters don’t just hide from surveillance. They actively fight it, using encryption and hacking as tools of political resistance.
But even Doctorow’s optimistic scenarios reveal the challenges. His protagonists are young, technically skilled, and willing to accept significant personal risk. They are, in other words, exceptional. Their tactics require expertise most people don’t have. The structural problem remains. Surveillance capitalism is backed by the resources of the world’s wealthiest corporations. Resistance is backed by the resources of individuals and small groups.
The Attention Economy
Cyberpunk fiction anticipated many aspects of surveillance capitalism, but one area where later speculative fiction has proven equally prescient is the attention economy. This is the recognition that in a world of information abundance, attention itself becomes the scarce resource that companies compete to capture and monetize.
M T Anderson’s Feed, published in 2002, imagines a future where teenagers have constant internet connections implanted directly in their brains. Advertising is everywhere, personalized and inescapable. Characters receive targeted promotions based on their mood, their location, their social network, their entire psychological profile.
The horror of Feed isn’t just the surveillance. It’s what the surveillance does to the characters’ minds. They can no longer concentrate. They can no longer think in sustained ways. Their attention has been so thoroughly colonized by commercial messages that they’ve lost the capacity for independent thought.
This feels painfully relevant to anyone who has watched teenagers, or adults for that matter, compulsively checking their phones, scrolling through infinite feeds, unable to sit with boredom or silence. The attention economy doesn’t just monitor us. It shapes us. It trains our minds to crave constant stimulation, to seek novelty over depth, to be perpetually distracted.
Social media platforms are explicitly designed to maximize engagement, which means maximizing time spent on platform, which means hijacking attention through whatever psychological levers prove effective. Outrage works. Fear works. Social validation works. The algorithm learns what hooks you and gives you more of it.
The result is a kind of mental environment that cyberpunk authors would recognize immediately. Overstimulated, fragmented, constantly switching between inputs. A consciousness optimized for data consumption rather than reflection or action.
Predictive Power
One of surveillance capitalism’s most troubling capabilities is prediction. By analyzing massive datasets of human behavior, companies can now predict individual actions with startling accuracy. What you’ll buy. Where you’ll go. What you’ll click. Who you’ll vote for.
Minority Report, both Philip K Dick’s original story and Steven Spielberg’s film adaptation, explores the dark implications of predictive power. In that world, precognitive mutants can foresee murders before they happen, allowing police to arrest people for crimes they haven’t yet committed. The ethical horror is obvious. You’re punishing people for what they might do, not what they have done.
We don’t have precognitive mutants. But we have something arguably more reliable: algorithms trained on billions of data points that can predict behavior with high probability. Predictive policing systems already operate in many cities, flagging neighborhoods and individuals as likely to commit crimes. Credit scoring systems predict your likelihood of defaulting on loans. Health insurance companies predict your likelihood of getting sick.
These predictions aren’t neutral observations. They shape the reality they claim to merely describe. If the algorithm predicts you’re high risk, you get denied the loan, which makes your financial situation worse, which confirms the algorithm’s assessment. Prediction becomes self fulfilling prophecy.
China’s social credit initiatives represent an extreme version of this dynamic. Though more fragmented in practice than Western reporting often suggests, the pilot programs score citizens based on their behavior, their purchases, their social connections. High scores grant access to better services and opportunities. Low scores result in restrictions and stigma. Behavior is monitored, predicted, and shaped through algorithmic reward and punishment.
Western commentators often describe China’s system as dystopian, which it is. But they sometimes fail to notice the surveillance infrastructure operating in their own countries. The mechanisms differ, but the logic is similar. Behavior is monitored. Data is harvested. Predictions are made. Incentives are adjusted.
Cyberpunk fiction understood that predictive power is a form of control. To predict is to preempt. To know what someone will do is to be able to stop them or manipulate them before they do it.
The Corporate State
One of cyberpunk’s most distinctive features is its replacement of nation states with corporations as the primary power structures. In Gibson’s future, governments still exist but they’ve become weak, ceremonial, or wholly owned subsidiaries of corporate interests. Real power lies with the zaibatsus, the megacorporations that control resources, information, and ultimately people.
This might have seemed exaggerated in the 1980s. It seems less so now. The world’s largest corporations command revenues exceeding the GDP of most countries. They employ more people than many national militaries. Their platforms mediate the communications of billions.
More importantly, they’ve acquired many functions traditionally associated with governments. They issue currencies, at least in cryptocurrency form. They maintain their own justice systems for disputes on their platforms. They conduct diplomacy with nation states, sometimes successfully resisting legal requirements they find inconvenient. They collect more data on citizens than any government bureaucracy ever managed.
Richard Morgan’s Altered Carbon series, beginning in 2002, depicts a future where corporations have essentially privatized everything, including government functions like policing and military force. The ultra wealthy have achieved effective immortality through technology, making them a permanent aristocracy utterly separated from the masses below.
The series’ cynicism about democratic accountability feels increasingly relevant. When corporations can fund political campaigns, shape public discourse through platform control, and lobby for favorable regulations, the distinction between corporate and government power blurs. The elected official who depends on corporate money, the regulator who expects to work for the industry they regulate, the politician who defers to tech platforms they don’t understand, none of these are exactly the cyberpunk scenario, but the direction is unmistakable.
Surveillance capitalism represents a new kind of corporate power, one based not on physical resources or manufacturing capacity but on information dominance. The companies that control the data pipelines, the search engines, the social networks, the cloud infrastructure, these companies wield a form of power that earlier corporate giants never achieved. They don’t just sell you products. They mediate your reality.
The Normalization of Surveillance
Perhaps the most disturbing aspect of surveillance capitalism is how normal it has become. Cyberpunk fiction depicted surveillance as menacing, something to resist or at least fear. But in our actual world, people voluntarily install listening devices in their homes, carry tracking devices everywhere, and share intimate details of their lives on corporate platforms.
This normalization is partly explained by the convenience bargain. We accept surveillance because it comes bundled with services we genuinely want. The cost of avoiding Google is using worse search engines. The cost of avoiding Amazon is paying more and waiting longer. The cost of avoiding social media is social isolation.
But there’s something deeper happening too. We’ve internalized surveillance. We’ve become accustomed to being watched, evaluated, scored. We perform for audiences we know are watching. We curate our online presences as personal brands. We accept that our data is the price of digital existence.
Foucault’s concept of the panopticon, derived from Bentham’s prison design where inmates could be observed at any time without knowing when they were being watched, has become a common reference point for understanding surveillance. The power of the panopticon isn’t constant observation. It’s the knowledge that you might be observed, which causes you to internalize the observer’s gaze and police yourself.
Social media has perfected the panopticon. You know that anything you post might be seen by anyone, forever. This knowledge shapes what you say and how you say it. You become your own censor, anticipating judgment before it arrives.
Dave Eggers’ The Circle, published in 2013, depicts a company clearly modeled on Google, Facebook, and similar tech giants that promotes total transparency as a social good. Privacy is treated as shameful, something only people with things to hide would want. The protagonist becomes so immersed in the company’s culture that she eventually advocates for mandatory surveillance, having fully internalized the ideology.
The Circle is often criticized as heavy handed, its satire too obvious. But reality has caught up with and exceeded its warnings. People now voluntarily broadcast their lives in ways the novel’s scenario only dreamed of. Influencer culture celebrates constant visibility as a career path. Privacy concerns, when raised, are often dismissed as paranoia or Luddism.
Digital Selves and Algorithmic Identity
Cyberpunk fiction has always been fascinated by identity, by how technology changes our understanding of what it means to be a self. From the AI characters in Gibson’s novels who may or may not be conscious to the replicants of Blade Runner struggling with implanted memories, the genre asks what makes us who we are.
Surveillance capitalism poses its own identity questions. The data profiles companies build about us are, in a sense, versions of us. They’re models designed to predict our behavior, and they often know things about us that we don’t consciously know about ourselves. The algorithm might recognize your depression before you do, based on subtle shifts in your browsing patterns.
These algorithmic identities are not us, but they increasingly determine how we’re treated. Credit decisions, job applications, insurance rates, even what content you see online, these are shaped by algorithmic assessments of who you are. The model substitutes for the person.
Kazuo Ishiguro’s Klara and the Sun, published in 2021, explores this dynamic through the perspective of an artificial friend, a robot designed to observe and serve a human child. Klara’s understanding of her human comes from constant observation, from building models of behavior and emotion. She loves the girl she serves, but her love is necessarily filtered through her algorithmic nature.
The novel raises uncomfortable questions about what it means to be known by a system that observes you but cannot truly understand you. The algorithmic profiles that surveillance capitalism creates are Klaras of a sort, observers that generate predictions and assessments without genuine comprehension.
This matters because how we’re seen shapes who we become. If the algorithm categorizes you as high risk or low value, that categorization has real consequences. It closes doors, shapes opportunities, nudges you toward certain outcomes. The feedback loop between algorithmic identity and lived identity is subtle but profound.
Possible Futures
Cyberpunk emerged from specific historical circumstances: the Reagan era, the rise of Japanese corporate power, the first generation of personal computers. Its vision of the future was shaped by the anxieties and technologies of its moment. Some of its predictions proved accurate. Others did not.
We got the surveillance. We got the corporate power. We got the data harvesting and the prediction engines. But we didn’t get the cybernetic implants, at least not yet. We didn’t get the virtual reality that replaces physical life. The metaverse, despite Mark Zuckerberg’s billions, remains disappointing.
More importantly, we didn’t get the aesthetic. Cyberpunk imagined futures that looked cool, all neon and chrome and leather. Our surveillance capitalism is beige. It’s corporate campuses with free snacks. It’s friendly interfaces designed to make extraction feel like connection. The dystopia came, but it came smiling, offering convenience, promising to make our lives easier.
This might be why cyberpunk’s warnings went unheeded. The genre gave us dystopia we could recognize and fear. What we got was dystopia we didn’t even notice arriving, so thoroughly had it disguised itself as progress.
Recent speculative fiction has tried to update cyberpunk’s vision for contemporary conditions. Writers like Annalee Newitz, Malka Older, and Chen Qiufan imagine futures shaped by climate change, platform capitalism, and Chinese tech development. Their visions are less noir, more diverse, more attentive to collective action and social movements.
Whether these new visions will prove as prescient as cyberpunk remains to be seen. What’s clear is that speculative fiction remains one of our most valuable tools for thinking about surveillance capitalism. By imagining how current trends might develop, fiction helps us see possibilities we might otherwise miss. By embodying abstract systems in characters and stories, fiction makes visible what might otherwise remain invisible.
The machine is learning you. It’s been learning you for years. Every click, every search, every moment of attention has fed the models that now predict and shape your behavior. Cyberpunk authors saw this coming when computer networks were still novelties. They tried to warn us.
We should have been paying attention.