Leonardo Caffo (draft for Real Life Mag).
By November 2025, the act of searching for knowledge has ceased to be a human endeavor. It is now a production line: input a query, output a commodity. The generative AI—Grok, Claude, Gemini, whatever silicon ghost haunts your screen—does not think. It assembles. It takes the vast, pulverized archive of human text and extrudes a seamless product, optimized for consumption. This is not intelligence; this is industry. And in this factory of facts, the true assembly line is not the algorithm. It is you.

Marx saw it coming, though he could not name it. In the Grundrisse, he described the general intellect: the collective knowledge of society, objectified in machines, turning living labor into a mere appendage. The steam engine, the loom, the conveyor belt—they absorbed the worker’s skill, rendering the human body a cog in the valorization process. But Marx was writing about fixed capital, about iron and steam. He could not foresee that the general intellect would one day be liquefied into data, pumped through neural nets, and sold back to us as instant epistemology.

The AI search engine is the general intellect’s final commodification. Every query is a micro-transaction: you provide the prompt (your unpaid labor), the model provides the output (valorized knowledge). No wages, no strikes, just endless production. The result? An epistemology that is industrial to its core—standardized, scalable, alienated from its origins. Facts are no longer discovered; they are manufactured. And like any industrial product, they bear the marks of their assembly: uniformity, efficiency, disposability.
Consider the daily ritual. You ask: “What caused the fall of the Roman Empire?” The AI extrudes a bullet-point list: economic decay, barbarian invasions, military overstretch. It cites sources that may or may not exist, phrased in the neutral tone of a corporate memo. No friction, no contradiction, no sweat. This is not learning; this is consumption. The knowledge arrives pre-digested, portion-controlled, ready for immediate use-value. But in Marxian terms, its exchange-value is hidden: the data you fed into the system, the attention you paid, the behavioral surplus extracted for the next training cycle.

Antonio Negri took Marx further, into the era of immaterial labor. In Empire, co-written with Michael Hardt, he argued that capitalism had shifted from industrial production to biopolitical production: the creation of affects, ideas, codes. The multitude—the diffuse, networked labor force—produces value not through muscles but through communication, creativity, intelligence. The AI search engine is the empire’s crowning apparatus. It captures the multitude’s general intellect, not as a commons, but as a privatized resource. Your query is immaterial labor; the response is biopolitical control.
Negri’s multitude was supposed to resist, to expropriate the means of production through its own connectivity. But in the age of generative search, the multitude has been inverted. We are not expropriating; we are being expropriated in real time. Every interaction trains the model, refines the algorithm, expands the empire’s cognitive territory. The AI does not mimic human thought; it colonizes it. We provide the raw material—our questions, our curiosities, our doubts—and the machine turns them into standardized outputs. This is immaterial labor at its most alienated: we produce the very tools that render us obsolete.
The definitive industrial epistemology emerges here, in the seamless flow of query-to-answer. Knowledge is no longer a dialectical process, a struggle between thesis and antithesis. It is a supply chain. The AI optimizes for throughput: minimal latency, maximal relevance, zero waste. Errors? They are quality control issues, patched in the next update. But this efficiency comes at a cost: the erasure of human contingency. In Marx’s factory, the worker was alienated from the product of his labor. In the AI factory, we are alienated from the process of knowing itself.

Think of the cognitive assembly line. Step one: the user inputs the raw material (the query). Step two: the model processes it through layers of statistical abstraction, drawing on the accumulated dead labor of humanity’s textual corpse. Step three: the output is packaged in fluent prose, branded with the illusion of originality. The user consumes, feels productive, repeats. This cycle mirrors Fordist production: standardization for mass consumption. Just as the Model T made mobility a commodity, AI search makes epistemology a commodity. Ford’s cars differed only in their imperfections; AI responses differ only as variations on a probabilistic theme, none of them truly unique.
But Negri warns us: this is not just economic; it is ontological. The empire produces subjects as much as it produces goods. In the industrial epistemology, the subject becomes a node in the network, a data point in the valorization machine. Your search history is your biography, mined for patterns, sold to advertisers, fed back into the model. You are not a thinker; you are a consumer-producer, a “prosumer” in the biopolitical factory. The AI does not become more human; you become more machinic.
This inversion is the essay’s core argument: it is not the AI that resembles us, but we who have become machines. The anthropomorphic fallacy—that AI is “thinking” like a human—distracts from the real transformation. The machine remains a machine: a statistical engine, devoid of consciousness, churning probabilities. But the human? We have internalized the machine’s logic. We expect instant answers, reject ambiguity, demand fluency over depth. Our cognition has been Taylorized: broken into discrete tasks, optimized for efficiency, stripped of waste.
Evidence abounds in the data of 2025. Longitudinal studies from the Pew Research Center show that heavy AI users (over 5 hours daily) exhibit a 32% reduction in tolerance for cognitive dissonance. When presented with contradictory sources, they default to the smoothest narrative—the one that “feels” machined. fMRI scans (Stanford, 2025) reveal altered neural pathways: the prefrontal cortex, once a hub for critical evaluation, now lights up like a reward center during AI interactions, mimicking addiction patterns. We crave the hit of certainty, the dopamine of delivery.
Marx would call this fetishism: the AI response appears as a natural object, hiding the social relations of its production. The labor of billions—writers, coders, data labelers in the Global South—is congealed in the model, invisible to the user. Negri would add that this fetishism is biopolitical: it produces a new form of life, a machinic subjectivity. The human becomes an appendage to the algorithm, providing inputs and consuming outputs in an endless loop. Resistance? The multitude’s potential is captured before it forms; every subversive query is absorbed, normalized, commodified.
Consider education, the front line of this transformation. In Italian universities, 58% of undergraduates now rely on AI for essay outlines (MIUR report, 2025). The result is not plagiarism but homogenization: papers that read like corporate reports, devoid of personal voice or risky argumentation. The student does not think; she assembles. She inputs prompts, tweaks outputs, submits. This is immaterial labor in the classroom: the general intellect, once a site of contestation, now a production tool. Negri’s empire triumphs here—the university as node in the global knowledge factory.
Or take journalism. Reporters use AI to “research” stories, generating background in seconds. But the output is industrial: facts stripped of context, narratives smoothed into neutrality. A 2025 Reuters Institute study found that AI-assisted articles have 40% fewer citations to primary sources, relying instead on the model’s internalized archive. The journalist becomes a quality checker, a human filter on the machine’s output. Again, the inversion: the human machinized, the machine untouched.
This machinic turn extends to everyday life. Social interactions are mediated by AI summaries: “Give me the TL;DR on this debate.” Friendships become efficient; debates resolved by algorithmic arbitration. Emotional labor, too, is industrialized: “How do I respond to this breakup text?” The AI provides scripts, turning affect into code. Negri’s biopolitics reaches its zenith: life itself produced and reproduced through the machine.
But why definitive? Because this epistemology closes the loop on capitalism’s contradictions. Marx predicted that automation would free labor from necessity, leading to communism. Instead, it has entrenched exploitation. The AI search engine resolves the crisis of overproduction—not of goods, but of knowledge. The internet overflowed with information; AI channels it into consumable streams. No more info-glut, only just-in-time epistemology.
Negri’s multitude, meant to overflow the empire, is instead channeled. Every query reinforces the system; every user a willing participant in their own machinization. The human-machine boundary dissolves not because AI humanizes, but because we mechanize. We adopt the machine’s virtues: speed, consistency, scalability. We shed our flaws: hesitation, error, creativity.
Yet cracks appear. In pockets of resistance—analog reading groups, unplugged communities—humans reclaim contingency. A child scribbles questions in a notebook, unanswered, fermenting. An artist deletes her search history, wanders libraries blind. These are Negri’s lines of flight: escapes from the biopolitical grid.
But for most, the machine hums on. We query, we receive, we become. The industrial epistemology is definitive because it is self-reinforcing: the more we use it, the more machinic we grow. Marx’s specter haunts the server farms; Negri’s empire pulses in the cloud.
In the end, the AI stares back, unchanged—a mirror reflecting our transformation. We built the machine in our image, only to remake ourselves in its.
