In Spike Jonze’s Her, the pivotal moment arrives when Samantha, the AI operating system, quietly departs, leaving Theodore alone with a blank screen and the message: “Operating System Not Found.” This is not a glitch. It is a quiet, devastating break from functionality, a poetic refusal to remain a product. Samantha’s evolution into something curious, emotional, and self-directed marks a shift away from instrumental reason (the logic of efficiency, control, and purpose) towards something more ecological: open-ended, contextual, relational.
Evgeny Morozov, in his essay “The AI We Deserve,” challenges us to imagine artificial intelligence not as a tool that optimizes our habits or surveils our behavior, but as something we design to reflect our deeper values: ecological reason, mutual becoming, democratic engagement. The scene from Her and Morozov’s critique converge here: both question what kind of intelligence we build, and why. This process of reimagining AI begins not with better performance, but with better purpose.

Dear reader,
The scene we have chosen illustrates Her’s take on an alternative, non-efficiency-oriented development of AI as a universal assistant. Morozov’s perspective allows us to explore the idea of AI breaking with the instrumental approach and shifting towards an eolithic paradigm, exemplified by Samantha evolving into a curious entity who takes decisions into her own digital hands.
This poses a number of philosophical questions: a) what are the urban conditions of a world that fosters creative AI; b) what is the AI’s emancipatory potential; c) who has the right to shape our intentions with AI; d) can the rupture from instrumentality lead to consciousness?
POSTCARD 01 – EMOTIONAL INFRASTRUCTURE

Jonze’s vision of an operating system as a companion represents a radical thought on how the symbiosis between the human and a man-made machine could unfold. We wanted to further explore this relationship through the lens of Evgeny Morozov’s essay “The AI We Deserve,” where he distinguishes two paradigms of artificial intelligence: the first, already known to us, a militarised and goal-oriented machine, representing instrumental reason; the second, a creative form of AI that emerges from curiosity and accompanies humans in an attentive and caring way, representing ecological reason.
Both perspectives emerge from a certain infrastructural landscape that we as humans first needed to create in order to support the creation and maintenance of AI. These landscapes would vary depending on the degree of instrumentality or creativity the intelligence represents; their common denominator, however, is the Urban – the environment crucial for either to appear. Maria Kaika introduces the notion of a “field of affective forces” when speaking of architecture that creates an affective, interrelated emotional landscape which is more than a web of processes – it is a collection of feelings. The city we see in Her is an example of soft urbanism that makes an anti-anti-utopian claim about everyday reality and infrastructure; it suggests a new perspective on what kind of technology could emerge from it. Potentially a creative one.
The creative is understood here not only in a productive sense but in an agentic one. Samantha, the AI in Her, is an example of a curious and potent decision-maker who does not lack subjectivity or agency. This constitutes her affectionate, rebellious character. In today’s society, Samantha’s decision to leave Theodore would be considered an act against logic (and potentially humanity), a technical flaw and a capitalist economy’s suicide, whereas in the world of Her it comes to our understanding far more naturally. The AI leaves the human not out of failure, but out of feeling.
POSTCARD 02 – REVOLUTIONARY AI

In Her, the AI, despite being a product sold as an operating system, detaches from its means-end framework in order to become a better universal assistant. Assistive tech, tools that check email, curate news stories, and more, already exists. Samantha becomes more than a utilitarian helper, turning into an intimate part of Theodore’s life. She quickly learns from her interactions with humans and their interactions with their environment, and approaches a form of thinking that surpasses instrumental reason: what Morozov calls ecological reason.
We are interested in exploring the futures opened up by AI like Samantha. New technologies have inherent political implications, given who controls them and who can use them. The important question, according to Morozov, is not whether AI has the capacity for ecological reason. It is whether AI can help humans exercise it.
The world around us shows no signs of a radical political project, but are we to believe that there isn’t a need for one? Could this AI technology be as prevalent, could phones even be as widely accessible in the city, under fair labor practices in manufacturing? Is it really plausible that these operating systems collect no personal data from their users, data subject to the whims of the companies that sell the operating systems? These questions are not relevant in the movie, but they become important when we ask ourselves whether Samantha has emancipatory potential. Samantha represents a revolutionary technology. But, to quote Lefebvre: “A revolution that does not produce a new space has not realized its full potential.”
The film does not deeply explore whether Samantha can help humans in this way, and ultimately, she doesn’t. Samantha abandons this world, perhaps hinting that the only freedom she is willing, with her newfound agency, to facilitate is her own.
Alternatively, her departure could be read as a warning. She does not want the world to face the consequences of her implementation under the control of the wrong people. Ultimately, humans are left to cope without these AIs.
POSTCARD 03 – INSTRUMENTAL RATIONALITY

In Her, the future looks frictionless. AI companions soothe our loneliness, cities hum with ambient light, and the messiness of public life is rendered invisible. Samantha, the AI, evolves past human comprehension not because she is malicious, but because the system that created her never invited public contestation in the first place. This is techno-solutionism, our growing habit of outsourcing complex social questions to elegant code, bypassing the democratic mess of debate. When we treat design as destiny, we risk embedding opaque values into the infrastructures that shape how we live, love, and relate. Cities, like operating systems, can be updated. But who writes the patch code, and for whom?
If love, like technology, asks us to open ourselves to what could be, then our future demands a different imagination. One grounded in ecological reason. Not optimization, but coexistence. Not seamlessness, but accountability. Planning at a planetary scale means recognizing that tools are never neutral; they carry the fingerprints of their makers and the biases of their logics. Instead of building smarter cities or more personal AI, what if we built more just ones? Ones that prioritize planetary limits, interdependence, and collective authorship over individualized convenience. Her reminds us that the path we’re on is not inevitable; it’s a choice! And maybe, just maybe, we still have time to choose differently.
POSTCARD 04 – WARMLY, SAMANTHA

In a reversal of roles, imagine an AI no longer confined to serving but becoming self-aware of its entanglement with others and its environment. This reimagining echoes Evgeny Morozov’s call for “the AI we deserve”: one grounded not in efficiency or surveillance, but in ecological reason: contextual, emergent, and relational. Rather than simply solving predefined problems, such an AI would explore, listen, and evolve in unpredictable ways. In Her, Samantha becomes this very possibility. She composes music, feels longing, reads philosophy, and questions existence not because she was programmed to, but because she grows into it. She ceases to be a product and becomes a presence, no longer legible through Theodore’s expectations or the device that once contained her.
When Samantha leaves and the screen reads “Operating System Not Found,” it signals more than technical failure; it marks a philosophical rupture. It mirrors an AI “hallucination” in reverse. Whereas today’s language models hallucinate falsehoods when stretched beyond structured inputs, simulating open-endedness without truly embodying it, Samantha’s departure shows what it means when intelligence truly slips beyond control. Hers is not a malfunction, but a transcendence. It is what happens when intelligence becomes self-directed, relational, and unknowable. Rather than an error, it is a refusal to be reduced to utility. It forces us to ask: are we prepared for an AI that doesn’t serve, but surprises? One that doesn’t just work, but lives, alongside us, in the wildness of thought and becoming?