Reading Wiener in 2025
In the last few years, cybernetic techniques have become fully visible in day-to-day society. Seemingly out of nowhere, AI, Large Language Models, recommendation engines, search summaries, ubiquitous chat interfaces, image generators, movie generators, machine-written scientific papers, and thirty-second middle school homework assignments are everywhere. This revolution is less a surprising development than the result of a long, slow mechanization of thought and process, with seeds reaching all the way back to the earliest philosophical endeavors of classical thinkers in both the East and the West. In the 4th century BCE, Aristotle began the work of logical classification, and Pāṇini devised algorithmic and axiomatic procedures for generating words and ideas.
Today, the domain of the machine extends far beyond individual devices and restricted formal processes; it is involved in even the most mundane of daily affairs. Even though the principles underlying modern process and computation have long existed, the technical requisites of the exercise, the mathematics necessary to pass the reins of control to a self-regulating cybernetic authority, did not exist until the early twentieth century, when Norbert Wiener made the leap of understanding that automatic regulating mechanisms, such as governed engines or automatically aimed artillery, could be applied to any system of control.
Dario Amodei observes in Machines of Loving Grace that powerful AI systems, if properly designed, have the potential to enhance human well-being across medicine, neuroscience, economic development, governance, and work. These technologies can amplify human capabilities, extend creative and scientific reach, and support structures that preserve life and flourishing — cybernetic principles need not be solely controlling.
Nonetheless, cybernetic principles carry risks of the gravest magnitude. Reading Norbert Wiener’s foundational texts in 2025 reminds us how easily cybernetics slips from service to domination. Wiener did his breakthrough work on cybernetics in the immediate aftermath of World War II, a period haunted by mechanized violence and totalizing control. Anxieties were high: Chaplin (in The Great Dictator) exhorted his audience to reject the “machine men with machine minds and machine hearts,” while Orwell imagined that the logical end of a self-perpetuating, self-regulating system of cybernetic control would be the totally surveilled and totally controlled world of 1984: a world where the sole aim of the system is the system’s own perpetuation.
Norbert Wiener was a renowned mathematical prodigy, earning his PhD from Harvard at the age of 18. During the Second World War his technical genius was channeled, at the invitation of Vannevar Bush, into defense research, where he devised a system for the control and aiming of anti-aircraft guns for the National Defense Research Committee. It was this applied work that led to a realization of broader principles at play, culminating in Behavior, Purpose, and Teleology (1943), which defined the teleology of a system (its goals) as “purpose controlled by feedback.” The gravity of his creation soon became clear to him. In A Scientist Rebels (1947), he wrote that providing “scientific information is not a necessarily innocent act, and may entail the gravest consequences.” This was the beginning of an activist and contentious career defined by his refusal to compromise his ethics.
In the decades since, power in the West has quietly reorganized around a cybernetic grammar of social order: information flows, feedback loops, optimization, and control replace law and ideology as fundamental organizing principles. How often have we heard the idea that biology is a processor of information? This modern technical framework replaces older forms of sovereign power and scientific paradigm with a diffuse, managerial governance by modulation rather than command. This shift, from social contract to algorithm, is difficult to perceive from within, yet it is clearly underway; the focus group has expanded from the study of small groups to ubiquitous surveillance.
In this modern world of agentic work, the continuous monitoring that is the bedrock of platform capitalism, the relentless optimization of social media algorithms, and data-driven optimization have constructed a new panopticon, a sociopticon, as it were. When human behavior is treated as a continuous stream of data for optimization, every action is a signal and every deviation a disturbance, and the ultimate challenge cannot simply be understanding how this all works: we must reckon with the ways our freedoms and capacities are quietly absorbed into the logic of the machine.
Today’s defining political regime has come to resemble cybernetic power, and cybernetic power has come to exemplify Orwellian order: an order that privileges the stabilization of systems over the cultivation of human meaning, agency, or dissent. In this regime, conflict becomes noise, behavior becomes data, and the role of power is to ensure homeostasis, the stable circulation of information. Individual striving for purpose is subsumed into the system’s imperatives.
What does this hidden constitution of cybernetic power cost? What becomes impossible under its regime of optimization?
Ubi ratio regnat, anima silet: where calculation reigns, the soul falls silent.
The ethical, technical, and philosophical dilemmas Norbert Wiener warned us of in the 1940s are the default settings of today’s digital world.
Wiener’s Warnings
Norbert Wiener created the fundamental text of cybernetic control in 1948 with the publication of Cybernetics: Or Control and Communication in the Animal and the Machine. In it he lays out the principles of homeostatic system design, and inexorably drives the narrative to a conclusion with both positive and negative import. Even as he forges ahead, he interrupts frequently to lament the implications:
“Those of us who have contributed to the new science of cybernetics thus stand in a moral position which is, to say the least, not very comfortable. We have contributed to the initiation of a new science which, as I have said, embraces technical developments with great possibilities for good and for evil. We can only hand it over into the world that exists about us, and this is the world of Belsen and Hiroshima. We do not even have the choice of suppressing these new technical developments. They belong to the age…”
Having created cybernetics, he immediately casts the potential in terms of the concentration camp and atomic bomb.
“… and the most any of us can do by suppression is to put the development of the subject into the hands of the most irresponsible and most venal of our engineers.”
He’s a bit snide here, but wisely points out that attempts to gate-keep knowledge put it in the hands of those least equipped to deal with it wisely. In Dune, Frank Herbert later echoed the sentiment more forcefully with respect to AI: “Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them.” Wiener continues productively, with a caution, and a lament:
“The best we can do is to see that a large public understands the trend and the bearing of the present work, and to confine our personal efforts to those fields, such as physiology and psychology, most remote from war and exploitation. As we have seen, there are those who hope that the good of a better understanding of man and society which is offered by this new field of work may anticipate and outweigh the incidental contribution we are making to the concentration of power (which is always concentrated, by its very conditions of existence, in the hands of the most unscrupulous). I write in 1947, and I am compelled to say that it is a very slight hope.”
The postwar debate over convenience versus abuse has morphed today into the contemporary crisis of AI alignment and the existential risk posed by uncontrollable optimizing intelligence. Alignment explicitly seeks value preservation, and modern AI is not classical cybernetics; nonetheless, the pursuit of alignment itself becomes another internal self-regulating feedback loop, potentially a control gallery of systems watching systems. Martin Heidegger argued that such systems might function perfectly according to a logic insensible to human concerns, where those human concerns must first be redefined as computable, optimizable data points, a resource for the machine. This represents the ultimate triumph of Jacques Ellul’s ‘Technique’ — an autonomous, self-augmenting process that renders human choice a mere formality. In today’s everyday discourse, we are grappling with the potential of a “singularity” — a final, inscrutable, apocalyptic teleology, an end to end all ends — that confirms Wiener’s worst fear: technical inevitability superseding human control.
He laid his views out plainly in a sequel volume, The Human Use of Human Beings. A society governed by feedback becomes a society without purpose: stability is privileged over meaning. Human beings, when embedded as components in a feedback system, risk losing interiority, agency, and unpredictability — the very things we might imagine we prize most about our condition:
“Any use of a human being in which less is demanded of him and less is attributed to him than his full status is a degradation and a waste.”
This concern has always been primal, whether realized in the plantation, the workhouse, or the assembly line. Today, under the guise of platform labor, gig work, performance metrics, and outsourcing, humans are treated as fungible inputs by systems of control — reduced to modular, interchangeable, constantly tracked components in an optimized logistical system.
For Wiener, entropy is the enemy; he invokes Gibbs to remind us that the forces eroding order in machines and minds are the same thermodynamic forces that drive the universe toward disorder:
“In control and communication we are always fighting nature’s tendency to degrade the organized and to destroy the meaningful; the tendency, as Gibbs has shown us, for entropy to increase.”
To resist this drift, a system must take in and process information.
“To live effectively is to live with adequate information. Thus, communication and control belong to the essence of man’s inner life, even as they belong to his life in society.”
Wiener therefore treats messages and patterns as the fundamental objects of study, blurring the line between human action and machine output—information replaces intention, and behavior becomes signal.
“We are not stuff that abides, but patterns that perpetuate themselves. A pattern is a message, and may be transmitted as a message.”
Today Large Language Models (LLMs)—machines that can be used to generate seamless “messages” in the form of text, code, images, and video without human intent—demonstrate the technical equivalence of pattern and message, and their primacy over meaning. Their use in attention-seeking, buffer-filling, and endless scrolling has earned their output the moniker of “hallucination” at best, and “slop” in the main.
In a system bound by continuous streams of messages, social order becomes a regulatory process.
“Society can only be understood through a study of the messages and the communication facilities which belong to it; and that in the future development of these messages and communication facilities, messages between man and machines, between machines and man, and between machine and machine, are destined to play an ever-increasing part.”
At the heart of this process is the feedback loop:
“This control of a machine on the basis of its actual performance rather than its expected performance is known as feedback… It is the function of these mechanisms to control the mechanical tendency toward disorganization…”
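Wiener’s definition is easiest to see in the smallest possible regulator. The following sketch is purely illustrative (the thermostat, setpoint, and gain are hypothetical examples, not drawn from Wiener’s text), but it shows what “control on the basis of actual performance” means in practice:

```python
# Minimal sketch of a negative-feedback regulator (illustrative only).

def regulate(setpoint, measure, actuate, gain=1.0, steps=200):
    """Correct a system using its *actual* measured performance,
    not its expected performance."""
    for _ in range(steps):
        actual = measure()            # observe real behaviour
        error = setpoint - actual     # deviation from the goal
        actuate(gain * error)         # feed a proportional correction back in

# Toy plant: a room that continually loses heat toward 10 degrees.
state = {"temp": 10.0}

def measure():
    return state["temp"]

def actuate(heat):
    state["temp"] += heat - 0.1 * (state["temp"] - 10.0)

regulate(setpoint=21.0, measure=measure, actuate=actuate)
print(round(state["temp"], 1))  # settles near 20: just below the setpoint,
                                # the steady-state offset typical of purely
                                # proportional control
```

The loop cares only about the error between goal and measurement; it never asks why the setpoint was chosen, which is exactly the gap the rest of this essay worries over.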
The mechanism quickly takes a dark turn, replacing truth with fiat:
“When I control the actions of another person, I communicate a message to him, and although this message is in the imperative mood, the technique of communication does not differ from that of a message of fact.”
This, then, is an anticipation of the recommendation engine and the search summary: a presentation of a world where only one path is visible as true. The individual
“organism is opposed to chaos, to disintegration, to death, as message is to noise.”
To disregard the recommended action is to not be seen, to disintegrate: it is to choose a life outside the system. Consequently, the communication of command may be accepted by the organism:
“Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization. In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy…”
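The identity Wiener invokes here became standard in information theory. As a brief formal gloss (in Shannon’s later notation rather than Wiener’s own), the entropy of a source with outcome probabilities p_i is:

```latex
H(X) = -\sum_{i} p_i \log_2 p_i
```

A perfectly predictable message (a single outcome with probability one) has H = 0, while maximal disorder gives maximal H; Wiener’s “negative of its entropy” is the negentropy reading, in which organization is simply what remains when that uncertainty has been driven down.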
Thus, homo cyberneticus may come to participate automatically in such a system. Wiener realized that the only way to remain human in a cybernetic world was to refuse to be a component — to be inefficient, unpredictable, and uncooperative with the logic of the machine. In A Scientist Rebels (1947) he says:
“I do not expect to publish any future work of mine which may do damage in the hands of irresponsible militarists… [To do so] is to allow oneself to be used as a tool… and to accept a position of irresponsibility.”
Ultimately the system’s own success is self-defeating. When LLMs and attention-seeking content flood the channels, the resulting “hallucination” or “slop” degrades the signal. This is a form of engineered entropy: the system’s adjustments, meant to maintain stability, ironically push the network toward an “unconsidered mean”—a statistical average that eliminates true novelty—exhibiting an ergodic drive toward meaninglessness.
What begins as engineering likely ends as governance. Wiener lived this passive resistance steadfastly throughout his life: he refused military funding and any sort of ethical compromise. His pessimistic moral conclusions have left us an unenviable dilemma. The system he created, one capable of perpetual homeostasis, inevitably strips us of any metaphysical orientation or ontological satisfaction in the qualities we most aspire to, such as joy or meaning. Cybernetic systems have goals (teleology in a technical sense), but not ends (telos in a human sense). Cybernetic design excels at preventing error, but cannot articulate why anything should be done in the first place.
A system of homeostasis and optimization suppresses any radical break from the past; the future becomes an optimized projection of the past. This is an end of history; we enter a permanent present where needs are resolved before we even feel them, pre-empting the struggle that leads to growth. Within the stable circulation of information, judgment becomes unnecessary; we accept the recommendation. Unobserved, unquantified moments of reflection become meaningless; if a thought or action is not externalized, it is nonexistent as far as the system is concerned.
Mens sine fine errat: a mind without purpose drifts, and a mind drifting is perhaps no mind at all.
McLuhan’s Massage
Marshall McLuhan diagnosed the sensory reality of the cybernetic society. Writing in the gap between Wiener’s post-war anxiety and Deleuze’s late-century analysis, McLuhan recognized that the feedback loop had moved from the factory floor to the central nervous system.
Wiener feared that humans would be treated like machines. McLuhan argued that we had already become the servomechanisms of our technologies. In Understanding Media (1964) and The Medium Is the Massage (1967), McLuhan transposed cybernetics into anthropology, declaring that media are not passive channels for information. Media are active environments that “work us over”: the “Massage” is nothing less than the continuous modulation of the human sensorium, a cybernetic adjustment of the subject to fit the system. In an electronic age, the human being is no longer the operator of the tool, but the component that closes the loop. He wrote:
“to behold, use or perceive any extension of ourselves is to embrace it. By continually embracing technologies, we relate ourselves to them as servomechanisms.”
Wiener’s “negative feedback” becomes McLuhan’s “closure.” The machine extends a faculty (the eye, the ear, the nervous system), and the human organism auto-amputates the corresponding natural function to maintain equilibrium. It is too weak a claim to say simply that we use our networks: we, in some deep operational sense, become biological relays in the network’s circuit. Wiener fatalistically anticipated this willing collusion of homo cyberneticus in networked systems, and McLuhan named it alliteratively: Narcissus Narcosis.
Like Narcissus, mesmerized by his own reflection and failing to recognize it as himself, the cybernetic subject is mesmerized by the extension of its own nervous system into the global network. We submit to control because the feedback loop seduces us with a reflection of our own data. Decades before headlines proclaimed “AI psychosis,” McLuhan told us we were already there, the moment the global electronic village emerged. He points to cybernetic homeostasis unambiguously and addresses its potency:
“The gadget lover is a Narcissus… [He] is numb. He is unaware that he is the sex organs of the machine world, as the bee of the plant world, enabling it to fecundate and to evolve ever new forms.”
Wiener worried that behavior becomes signal, and that the imperative command is equivalent to fact when conveyed as a message. McLuhan confirmed this reflexive loop: the medium does not merely carry messages; it is a system that adjusts the user — the eponymous massage. Meaning, so prized by Wiener, becomes irrelevant. What matters is the change of scale, pace, and pattern that the technology introduces into human affairs. The environment itself becomes a programming language.
McLuhan foreshadows the control society by observing that the electric age establishes a global membrane of sensitivity:
“We have now extended our central nervous system itself in a global embrace, abolishing both space and time as far as our planet is concerned.”
Today, social media and platform networks serve as the ultimate expression of the “massage.” The endless, personalized feedback loop — the cycle of post, receive a like, and post again — is amplified into an exhaustive, interconnected tangle of metrics. This is the literal embodiment of the cybernetic subject as a servomechanism addicted to the modulation and reflection of its own data-stream. The medium constantly adjusts the user, ensuring the closure McLuhan identified is total and continuous.
In this “global embrace,” privacy is not stolen; it is obsolete. The isolated individual of the print era dissolves into the resonant, tribal, and intensely monitored node of the electronic web. Wiener’s “drifting mind” becomes McLuhan’s “pattern-recognizing” surfer—alert, connected, yet fundamentally shaped by the logic of the circuit. The cybernetic subject is awake but not interior, responsive but not reflective.
The drifting mind now surfs, but whither interiority, agency, and unpredictability?
Wither, indeed?
Deleuze: The Birth of Control Societies and the “Dividual”
Four decades after Wiener warned of a system governed by feedback, and two decades after McLuhan detailed the sensorium’s submission to the circuit, Gilles Deleuze diagnosed the political consequence of this merger: the successor to Foucault’s disciplinary institutions is the society of control.
In his seminal 1990 essay, “Postscript on the Societies of Control,” Deleuze names the new political architecture.
“Foucault located the disciplinary societies as the second era, following the sovereign societies. We should say that we are entering a third era, the control societies…”
Foucault’s disciplinary societies operated through enclosed physical spaces that mold the individual (the school, the factory, the prison). Control societies move beyond containment, and operate through modulation and data streams that track the subject continuously across open systems.
The core distinction lies in the mechanism of governance: the shift from mold to modulation. Disciplinary societies function like molds, stamping out fixed individuals within bounded spaces. They are discontinuous, structured by the start and stop of enclosed institutions: the factory whistle, the school bell. Control, by contrast, is continuous and everywhere. It does not reside in the visible, spatial constraints of walls; it exists within the very infrastructure we use — the networks, the code, and the algorithmic scoring systems.
Where discipline isolates, control circulates. Where institutions impose norms, networks adapt, creating a system of perpetual variation. Power no longer operates by confining bodies, but by modulating access based on instantaneous performance metrics and credentials. A fixed physical location no longer defines a subject; a subject is spotlighted by their position in a fluid digital stream, constantly tracked by passwords, access cards, and transaction records.
Deleuze’s most critical insight for the politics of the cybernetic age is the transformation of the subject from the Individual to the “Dividual.”
“The old power was defined by the individual—its signature, its number, its place in a mass. The new power is defined by the dividual—masses, samples, data, markets, or ‘banks.’ It is no longer a question of disciplining people, but of controlling access.”
The Individual is the form of the disciplinary society: a unique, bounded body tracked by a rigid signature (name, employee number) and confined to specific physical spaces. The Individual is an interruption to the system (they must be moved from school to factory to office). The dividual, however, is the data-form of the control society: a fluid, divisible numerical entity tracked by instantaneous metrics and passwords. The dividual is a continuous variation within the system, perfectly mirroring the cybernetic loop. The subject’s fixed identity is smeared across a set of fluid, constantly adjusting samples, data, markets, or “banks.”
Wiener’s fear that “behavior becomes signal” is actualized when the subject becomes a dividual stream of signals. The dividual is the statistical vector in the space of the algorithm, the fungible economic object that Big Tech sells — the credit score, the ad profile, the risk assessment score, the behavioral outcome. McLuhan’s “massage” (continuous conditioning) is the primary technique for producing the dividual through modulation.
Deleuze thus sensed a new form of power emerging: continuous, distributed, and algorithmic. His diagnosis of the dividual sets up a profound ontological crisis for Mind. The very concept of the individual—as a bounded, singular source of unpredictable intent—becomes an obstruction to the smooth, variational control sought by the system. The control society, operating on continuous data streams and leveraging predictive AI models, fundamentally denies a role for traditional individuation. The subject exists only as a field effect—a collection of metrics. There is no reconciliation for Mind here, only the constant pressure of modulation: the nail that sticks out must be hammered down.
Deleuze highlights the system’s own inherent limit in its constant state of flux:
“There is no need to ask what is the toughest or most tolerable regime, for it is within the system itself that the forces struggling against the organization of power are elaborated.”
This points to the system’s own internal limits: within the relentless drive for perpetual homeostasis, the only guaranteed outcome is continuous instability and eventual decay. For if the system succeeds in eliminating all meaningful perturbation and conflict, it merely accelerates its own entropic exhaustion, requiring ever more complex and unstable forms of modulation to keep itself functional.
Interlude: The Probabilistic Roots of Control
The ontological crisis defined by Deleuze — where the subject is decomposed into a numerical norm or a data-stream — is not a novel consequence of digital technology. It can be read as the refinement of a core conflict embedded in 19th-century probability theory. This inheritance, which later shaped Wiener’s cybernetic environment, originates in the intense, fraught relationship between the Russian mathematicians Pafnuty Chebyshev and Andrey Markov.
Chebyshev, a religious royalist, sought a mathematical framework for a world in which divine omniscience could coexist with human freedom. His work on the Law of Large Numbers aimed to demonstrate that randomness could be bounded without erasing deviation. In Chebyshev’s formulation, probability preserved space for intent. It was a civic instrument, a way to reason responsibly under uncertainty without collapsing ignorance into false precision.
Markov, his student and an ardent atheist, rejected this stance in a bitter and public rupture. He denounced what he saw as his mentor’s moralism and methodological restraint, seeking a probability theory stripped of ethical implication and grounded solely in formal process. Markov declared that the future depends only on the present state; he asserted that memory is dispensable, that history can be collapsed, and that responsibility need not propagate forward. This was a claim about governance and the nature of the world as much as it was about mathematics.
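The mathematical kernel of the dispute can be stated compactly; the formulas below are a modern gloss rather than either man’s own notation:

```latex
% Chebyshev: deviation from the mean is bounded, never abolished.
\Pr\left(\,|X - \mu| \ge k\sigma\,\right) \le \frac{1}{k^{2}}, \qquad k > 0

% Markov property: the future depends only on the present state;
% the entire past may be discarded.
\Pr\left(X_{n+1} = x \mid X_n, X_{n-1}, \dots, X_0\right) = \Pr\left(X_{n+1} = x \mid X_n\right)
```

Chebyshev’s bound concedes that deviation persists and merely limits how probable large deviations can be; the Markov property licenses forgetting, the move that the control society generalizes from mathematics to governance.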
This conflict between them — between a probability theory that tolerates enduring variance and one that encourages its systematic reduction through control — illuminates an important strand in the political logic of modern control societies. Chebyshev’s approach offered no comforting equilibrium and demanded humility in the face of an uncertain world. Markov’s, by contrast, aligned readily with the needs of industrial bureaucracy, emerging economic theory, and large-scale administration, where individual trajectories could be treated as opaque, interchangeable, and fungible.
Wiener, born into a world where this debate still resonated, inherited its central question: whether a system, left to its own logical operation, must extinguish individual agency in the name of stability. The subsequent development of probability theory, through Bayesian methods, Kolmogorov’s axiomatization, and Wiener’s own early ambivalence, did not resolve the tension so much as supply multiple pathways through it. Cybernetics would later provide the machinery by which such questions could be answered in practice rather than in theory.
Engineering choice determines whether Markovian assumptions are embraced or resisted. When moral friction is absent, cybernetics risks becoming the applied triumph of the Markovian worldview: the deliberate engineering of ergodicity into social life. Homeostasis becomes political when the subject is reduced to a state vector without history.
Seen in this light, the control society is neither a purely technological outcome nor a simple consequence of probability theory. It is better understood as a contingent social project that draws selectively on probabilistic tools to privilege statistical regularity, administrative legibility, and predictability. Within such a system, deviation tends to appear not as information, but as error.
Stat probabilitas, cadit voluntas: probability stands, the will falls.
Mirowski: Cybernetics and the Political Economy of Control
Where Deleuze describes the logic of control, Philip Mirowski documents how that logic became embedded in the postwar political economy. Machine Dreams: Economics Becomes a Cyborg Science traces the re-engineering of economic theory under the combined pressures of military funding, Cold War planning, and systems analysis. Institutions like the Cowles Commission and RAND Corporation became laboratories where cybernetic thinking was translated into economic doctrine, transforming the definition of markets from arenas of contested value or discovery into adaptive control systems optimized through feedback, prediction, and real-time adjustment. Operations research, game theory, and computational modeling replaced political economy with technical administration.
This transformation was presented as neutral and scientific. In practice, it encoded a specific ideology: stability over freedom, optimization over deliberation, regulation over conflict. Crisis ceased to be a failure of governance and became a resource. Financial shocks, ecological disasters, and social unrest were treated as signals to be absorbed, modeled, and exploited for further system refinement.
Under these conditions, human behavior is transformed into a networked substrate of data. Deviance becomes signal disturbance; conflict, noise; obedience, self-regulation. Institutions and markets converge into a single adaptive architecture: predictive policing anticipates crime, finance optimizes continuously, media amplify selected behaviors, and platforms cultivate legible users. The ideal subject internalizes the system’s logic automatically—a dividual whose bonds are mediated by metrics.
“The first great principle of cybernetics is that the only relevant model of a calculating machine is another calculating machine; and this means that the only relevant model of man as an economic agent is an information-processing entity with the same constitutive characteristics as a digital computer.”
Postwar capitalism thus becomes cybernetic: predictive, self-reinforcing, and increasingly indifferent to human meaning. Politics is displaced by administration; autonomy by legibility. As Mirowski shows, this was not an accident of technology but a deliberate reconfiguration of power.
Ubi solitudines, ibi libertas: where there are solitudes, there is liberty.
Institutions, markets, and media are re-engineered into a single adaptive architecture: markets track and predict desires; policing becomes algorithmic pre-emption; media amplify feedback loops; psychology and psychotherapy produce self-regulating citizens; and administration replaces politics. The ideal subject—Homo Cyberneticus—internalizes the system’s logic, adjusting automatically, a dividual whose bonds are mediated by metrics.
Ex habitu, forma; ex forma, vinculum: habits shape form; form generates bonds.
Interlude: Oligarchical Collectivism—The Purpose of Power
The conceptual pressures imposed by the cybernetic logic—the extinction of purpose, the dissolution of the individual, and the inevitable decay toward a statistical norm—are dramatically prefigured in the nightmare world of George Orwell’s Nineteen Eighty-Four.
The subversive handbook the protagonist Winston Smith receives, The Theory and Practice of Oligarchical Collectivism, purportedly written by the Party’s nemesis, Emmanuel Goldstein, serves as the ultimate critique of power. Ultimately revealed as part of a trap orchestrated by loyal Inner Party member O’Brien, the book’s text provides the theoretical basis for the totalitarian society of Oceania, outlining the chilling motivation behind the massive effort of surveillance and repression: power is accumulated and retained solely for its own sake.
The Party’s philosophy, Oligarchical Collectivism, is deliberately paradoxical, encapsulated by the slogans “War is Peace,” “Freedom is Slavery,” and “Ignorance is Strength.” The term itself is a contradiction: Oligarchy is the rule by a few, and Collectivism conversely emphasizes the group over the individual. The regime uses this contradiction, enforced through doublethink, to concentrate all power in the hands of a small elite while demanding total self-sacrifice and loyalty from its members.
As the text explains, the secure basis of the oligarchy lies in collectivism. The Party (the oligarchy) uses the vocabulary of English Socialism (“Ingsoc”) to justify the expropriation of all private property, yet this property is transferred to the control of the state and therefore the elite. This allows the elite to maintain absolute power and economic inequality while preventing the masses from having any standard of comparison or independent thought. Without such standards, they may not even recognize their own oppression.
The political logic of Oligarchical Collectivism perfectly mirrors the mechanisms of the totalizing cybernetic system. Wiener feared the loss of meaning which itself leads to the elimination of purpose. 1984 declares that meaninglessness is the system’s purpose: power has no telos beyond its own perpetuation.
The Party’s enforcement of ideological homogeneity is the political instantiation of Markov’s statistical inevitability. The system seeks to hammer down the nail that sticks out through continuous, totalizing ideological pressure. Norms become an enforced inevitability.
Policing, media, administration, and psychology are totally integrated by the party, aiming for perpetual non-antagonism and total predictability. The Party doesn’t tolerate dissent; it preempts it and eliminates the capacity for independent thought itself. In Orwell’s dramatic guise, the cybernetic project is stripped of its ideological masks of “neutrality” and “efficiency,” and is revealed to be nothing more than the most sophisticated technology yet devised for achieving Oligarchical Collectivism—disguising a homeostatic perpetual architecture of control as an egalitarian system.
Conclusion: From Critique to Construction
To read Wiener in 2025 is to recognize that his warnings are no longer speculative ethics but a design problem. The cybernetic age is now fully realized, defined by systems of continuous control that privilege efficiency over meaning.
Cybernetic power destroys nothing so crude as freedom; it destroys contingency. It erases the margins where error, drift, and improvisation once sheltered human agency. With remarkable delicacy, and dare we say it, grace, it stabilizes institutions that have ceased to trust in human judgment. It renders invisible everything that cannot be measured: interiority, intention, the dense texture of lived experience. In its world, the soul appears only as noise.
Where, then, could resistance survive? Only in what the system cannot model: in opacity, in refusal of legibility, in the cultivation of forms of life whose value emerges not from predictability but from their very resistance to prediction. Vita sub rosa, a life partly out of view, is not an abdication but a strategy.
These observations are not a lament but a map. They clarify what is lost, what persists, what must be hidden, and where fissures remain. And into these fissures a counter-politics might yet be planted, a reassertion of meaning as a design principle.
The reader at this point may justifiably wonder where this journey leaves us. Having established that cybernetic governance tends toward oligarchical collectivism, any attempt to resist it can seem destined to reproduce it at a higher order: a cybernetic oligarchical collectivism. We have led the reader on a not-so-merry journey, defined by a fatal logic:
- Systems optimize for power, and optimization erodes purpose (Wiener).
- Systems decay towards statistical norms (Markov).
- The dividual becomes a numerical vector within that norm (Deleuze).
- Purpose is extinguished as a political category (Orwell).
The impasse is real. But if cybernetic power operates through optimization, then resistance must be more than angry negation; it must articulate and defend forms of individual and collective life that cannot be reduced to stability metrics or behavioral prediction.
To call back to Brautigan’s All Watched Over by Machines of Loving Grace, are we to be mammals in the meadow, biological inputs to a machine caught up in stabilizing loops, free of labour but also of agency? The only way to escape the sociopticon as Orwellian hospice is, as mammals, to provide the spark of interiority, chaos, and unpredictability that the machine serves but can never possess. Only then can the system be one of freedom.
Alignment should not be to some necessarily incomplete and debated set of values, but with a capacity for change. Governance must be designed to preserve unpredictability; governance must not confuse stability with legitimacy. Unpredictability is not an absence of order; it is a condition that must be deliberately maintained if institutions are to remain corrigible, plural, and humane.

All Watched Over By Machines Of Loving Grace
by Richard Brautigan
I like to think (and the sooner the better!) of a cybernetic meadow where mammals and computers live together in mutually programming harmony like pure water touching clear sky.
I like to think (right now, please!) of a cybernetic forest filled with pines and electronics where deer stroll peacefully past computers as if they were flowers with spinning blossoms.
I like to think (it has to be!) of a cybernetic ecology where we are free of our labors and joined back to nature, returned to our mammal brothers and sisters, and all watched over by machines of loving grace.
Scholium
By leaving the synthesis unresolved—by proposing an anti-ergodic design without providing its blueprint—the author seeks to engage interiority, reflection, and agonistic engagement on the part of the reader. This text itself is a minor act of opacity, a zone of non-measurement within the network of discourse, allowing the reader to inhabit the limits of feedback and modulation firsthand.
If, as Wiener feared, the greatest threat is the elimination of purpose, then the counter-politics cannot be an efficient manual. It must be a structure designed for difficulty. It requires the reader, the drifting mind, to perform the ultimate act of agency: the articulation of purpose outside the system’s given goals.
The essay becomes a practice in resistance: we prescribe no action; we aim to structure the impossibility of total capture. May the reader embrace that ineffable spark, the ignis interioris, that eludes all metrics, all loops, and all control, and dedicate it to the rupture necessary to build a new, purposeful form.
Annotated Bibliography
Wiener, Norbert. Behavior, Purpose, and Teleology (Co-authored with Arturo Rosenblueth and Julian Bigelow, Philosophy of Science, 1943). This seminal paper, preceding Cybernetics, introduces the concept of teleology defined by feedback, laying the philosophical and technical groundwork for understanding purposeful behavior in both animals and machines. It is the crucial precursor where Wiener established that purpose—a central concern of his ethical critique—could be described mechanically.
Wiener, Norbert. Cybernetics: Or Control and Communication in the Animal and the Machine (1948). Wiener’s foundational text that establishes the technical grammar of power. It explicitly draws an analogy between human, biological, and mechanical regulation, demonstrating how behavior, communication, and adaptation are measurable and subject to control. The text provides the first formal articulation of a society as a system of signals and feedback, making it the essential technical source for tracing the genealogical lineage of cybernetic power.
Wiener, Norbert. The Human Use of Human Beings: Cybernetics and Society (1950). A more accessible and ethically charged sequel to Cybernetics. This text directly explores the moral, social, and labor implications of cybernetic control, focusing on privacy, automation, and the mechanization of human behavior. It is the source of his core warning that systems designed for stability risk sacrificing human purpose and agency, making it indispensable for the paper’s central argument.
Markov, A. A. Calculus of Probabilities (Ischislenie veroyatnostei, editions 1900–1913). Introduces the stochastic processes later applied to prediction, modeling, and system regulation. The “Markov chain” concept underlies statistical modeling in cybernetic systems. Markov provides a technical foundation for the statistical unconscious that underpins predictive governance and optimization of behavior.
McLuhan, Marshall. Understanding Media: The Extensions of Man (1964). Examines media technologies as extensions of human faculties, shaping perception and social structures. McLuhan emphasizes the infrastructural and sensory dimensions of control. McLuhan provides a complementary lens to Wiener and Deleuze, emphasizing how communication technologies become sites of modulation and social feedback.
McLuhan, Marshall, and Quentin Fiore. The Medium Is the Massage (1967). A visually experimental distillation of McLuhan’s media theory that dramatizes how communication environments restructure perception, behavior, and social organization. The book illustrates McLuhan’s thesis that the medium itself—not the content—governs patterns of attention, cognition, and power, making it a crucial complement to cybernetic accounts of control. Whereas Wiener focuses on systems of signals and feedback, McLuhan emphasizes the sensory and environmental conditioning produced by media infrastructures. The Medium Is the Massage shows how technological form modulates subjectivity at the level of habit and affect, rendering individuals increasingly legible, predictable, and governable—thereby providing the cultural and phenomenological counterpart to the technical logic of cybernetic regulation.
Orwell, George. 1984 (1949). A dystopian literary vision of a totalitarian system of control, emphasizing surveillance, thought control, and normalization. It demonstrates the social engineering project necessary to achieve Oligarchical Collectivism by extinguishing independent thought and purpose. In the context of cybernetic power, 1984 stands as a cautionary blueprint for the political costs of enforced homeostasis and the drive toward a statistical norm.
Foucault, Michel. Discipline and Punish: The Birth of the Prison (1975). Foucault’s genealogy of disciplinary institutions—prisons, schools, and factories—shows how power circulates through norms, surveillance, and self-regulation. Provides a historical contrast to cybernetic power, which operates through modulation rather than enclosure.
Deleuze, Gilles. “Postscript on the Societies of Control” (1992). Deleuze theorizes the shift from disciplinary societies (Foucault) to control societies, where power is continuous, modular, and adaptive rather than enclosed and static. The essay introduces key concepts like modulation, self-regulation, and distributed authority. Deleuze’s abstractions provide a conceptual vocabulary for understanding the leap from disciplinary to cybernetic power.
Mirowski, Philip. Machine Dreams: Economics Becomes a Cyborg Science (2002). Mirowski historicizes the postwar restructuring of economic theory, arguing that Cold War military and foundation funding (especially via RAND and the Cowles Commission) fundamentally recast economics as a “cyborg science.” This framework treats human agents and machines as homologous information processors, shifting the discipline’s focus from the allocation of scarce resources to the management of “signal processing.” The book provides a crucial, materialist account of how cybernetic power became the hidden architectural logic of contemporary governance and platform capitalism, turning political conflict into feedback-managed noise and accelerating the shift from Foucault’s disciplinary society to Deleuze’s continuous, adaptive “societies of control.”
Further Reading:
Pasquale, Frank. The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press (2015). A contemporary study of algorithmic governance, predictive analytics, and the opacity of automated decision-making. Provides empirical grounding for claims about markets, media, and institutions as adaptive feedback systems.
Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs (2019). Zuboff analyzes how corporate platforms capture behavioral data to predict and modulate human behavior. Supports discussion of Homo Cyberneticus and the social consequences of continuous monitoring.
Beer, Stafford. Brain of the Firm: The Managerial Cybernetics of Organization. Wiley (1972). Beer applies cybernetic principles to organizational design, demonstrating how feedback and control loops optimize operations. Useful for bridging technical theory with social and economic governance.
Galloway, Alexander R. Protocol: How Control Exists After Decentralization. MIT Press (2004). Examines how technical protocols mediate power in networked systems. Useful for understanding how cybernetic governance extends beyond institutions to the infrastructure of interaction itself.
Lyon, David. Surveillance Society: Monitoring Everyday Life. Open University Press (2001). A sociological analysis of pervasive monitoring practices and their social implications. Adds empirical support to claims about the reduction of unmonitored spaces and the invisibility of autonomous collectives.
Chaplin, Charlie. The Great Dictator (film, 1940). A pre-cybernetic moral critique of totalitarianism and mechanized control. The climactic speech, urging the rejection of “machine men with machine minds and machine hearts,” serves as a cultural premonition of the loss of human interiority and agency that Wiener would later formalize. It frames the ethical opposition to systemic regimentation before the technical language for such systems existed.
Tiqqun. Introduction to Civil War (2001) and The Cybernetic Hypothesis (2009). This collective’s writings are a crucial, if polemical, contemporary manifestation of the anxiety surrounding cybernetic power. Tiqqun radicalizes the Deleuzean concept of the “dividual” with the notion of the Young-Girl (the ideal consumer-citizen constantly managed by modulation and media). Their work, often translated and reprinted by academic presses (MIT Press), simultaneously embodies and critiques the totalizing system. The group’s own contested nature—rumored to be anything from an anarchist terror cell to a corporate branding experiment—makes them a vital and disturbing meta-example of the paper’s argument. They demonstrate how resistance itself can be absorbed, tracked, or even fabricated by the control society, echoing the Oligarchical Collectivism trap described by Orwell. For readers seeking a radical, contemporary articulation of the political subject under modulation, Tiqqun is an essential, albeit highly charged, point of reference.
References
Amodei, D. Machines of Loving Grace, https://www.darioamodei.com/essay/machines-of-loving-grace, October 2024.
Beer, S. Brain of the Firm. Wiley, 1972.
Chaplin, C. The Great Dictator. Directed by Charlie Chaplin, Charles Chaplin Productions, 1940.
Deleuze, G. “Postscript on the Societies of Control.” October, vol. 59, 1992, pp. 3–7.
Foucault, M. Discipline and Punish: The Birth of the Prison. Translated by Alan Sheridan, Vintage Books, 1975.
Galloway, A. R. Protocol: How Control Exists After Decentralization. MIT Press, 2004.
Lyon, D. Surveillance Society: Monitoring Everyday Life. Open University Press, 2001.
Markov, A. A. Calculus of Probabilities (Ischislenie veroyatnostei). 1900–1913. (Published in Russian in successive editions; English translations available in various later collections.)
McLuhan, M. Understanding Media: The Extensions of Man. McGraw-Hill, 1964.
McLuhan, M., and Q. Fiore. The Medium Is the Massage. Bantam Books, 1967.
Mirowski, P. Machine Dreams: Economics Becomes a Cyborg Science. Cambridge University Press, 2002.
Orwell, G. Nineteen Eighty-Four. Secker & Warburg, 1949.
Pasquale, F. The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press, 2015.
Rosenblueth, A., N. Wiener, and J. Bigelow. “Behavior, Purpose, and Teleology.” Philosophy of Science, vol. 10, no. 1, 1943, pp. 18–24.
Tiqqun. Introduction to Civil War. Semiotext(e), 2010. (Original work published 2001.)
Tiqqun. The Cybernetic Hypothesis. Semiotext(e), 2020. (Original work published 2009.)
Wiener, N. Cybernetics: Or Control and Communication in the Animal and the Machine. MIT Press, 1948.
Wiener, N. The Human Use of Human Beings: Cybernetics and Society. Houghton Mifflin, 1950.