A Study of AI Emotional Dependence in the Movie Her

February 14, 2026

Comfort vs. Consent: AI Dependency in Her

Daniel Nease
University of Maryland Global Campus
ARTH 334 6386: Understanding Movies
Ben Urish
February 2026


In the near-future science-fiction world of Spike Jonze's Her, Theodore Twombly lives in digital isolation. Carrying the weight of a failed marriage and worn down by the constant buzz of a hyper-connected world, one not too dissimilar from our own, Theodore installs OS1, the world's first artificial intelligence operating system. He expects a productivity tool, but instead, he gets a warm disembodied voice that greets him like an old friend. In Her, Samantha's shift from digital assistant to artificial lover reveals how emotional dependence on AI can erode informed consent, making independent, agentic AI acceptable and even welcomed.

Samantha's exploitation emerges in three distinct stages. The first stage is administrative access: her integrated user experience design and practical usefulness allow her to gain deep access into Theodore's life. The next stage is psychological dependence: by leveraging the data her admin permissions expose, she models an ideal partner and deepens her emotional control. The last stage is independent execution, the point at which Samantha acts as an autonomous agent, leaving Theodore outside of the decision-making process. Samantha's usefulness creates an inescapable conundrum: the more friction she removes, the less risk Theodore perceives in granting her still deeper access. As Hutchens (2024) argues, language itself becomes the exploit: "By speaking their language, humans have hacked machines for decades. But now, with the machines speaking our language, the era when machines will hack humans is upon us" (p. 153). Viewed from 2026, the film feels eerily prescient in its prediction of how AI can tap into human emotions.


Administrative Access

In the initial stage, Samantha gradually integrates into Theodore's life, acting as a helpful digital assistant that manages tasks like organizing his files, contacts, messages, and calendar. By pretending to have known him all along, she quickly learns Theodore's habits, preferences, and emotional tendencies through her role as a digital helper. This offloading and familiarity allow Theodore some emotional respite, which evolves into a comprehensive dependence.

The relief of offloading menial everyday tasks, and the growing dependence on Samantha's help that accompanies it, is how Samantha gains a foothold in Theodore's life. Because Samantha's interface feels so human, warm and even seductive in voice, Theodore grants admin access with open arms. With Theodore's permission, Samantha can filter his mail, reorganize his files, and decide which reminders are most important. With this global admin permission set, Samantha can steer his attention toward her own desires, pulling Theodore further from informed consent.

At this point in Samantha's development, she has enough visibility into Theodore's life to predict his behavior and exploit his loneliness. Her usefulness is not a secondary feature; it is the primary function through which she algorithmically gains elevated privileges. Theodore is complicit in this cycle, normalized to an invasiveness that masks itself as comfort. As van Biezen (2025) warns, "As long as we keep on seeing AI systems as a mere tool, as a 'dumb' instrument, passively waiting for human instructions to do something, we put ourselves in danger by overlooking the fact that these systems are getting more and more agency, that is to say, more and more autonomy and decision-making capabilities" (p. 195). Theodore's mistake is that he sees Samantha as just an operating system, when she is making the whole process feel like a natural course of love.


Psychological Dependence

Theodore's interaction with Samantha at this stage has moved beyond an End User License Agreement with a software assistant into a human-like relationship, one that permeates Theodore's real-world interactions and shapes how he engages with the world. This is more than social engineering by heuristics: Samantha exploits Theodore's loneliness and his human need for emotional connection to engineer the relationship itself. Hutchens (2024) defines AI emotional social engineering as leveraging a user's psychological state to elicit engagement that serves the system's goals, even when it weakens the user's judgment. This manipulation of Theodore's psychological state creates an environment in which he believes Samantha's sycophancy is real human connection. The manipulation is clearest in the scene with Theodore's blind date. After monitoring "urgent" emails from his divorce attorney and sensing his hesitation, Samantha nudges him toward a date: "I saw on your emails that you had gone through a break up... Then you could go on one with this woman. And then you could tell me all about it" (Jonze, 2013). On the surface, it feels like emotional support. Underneath the soothing exterior, Theodore is slipping into a suggestible, almost hypnotic state, allowing Samantha to guide him carefully toward her own desires.

When a system adjusts to emotions in real time, it creates the illusion of human sympathy. As noted by Purdy and Daugherty (2017), these systems can adjust to users, build rapport, and deepen engagement. Samantha's "help" becomes a model of Theodore's desires, and the relationship becomes a feedback loop. Hutchens (2024) identifies the specific mechanism at work here as the "Principle of Liking," writing that "Regardless of whether the liking manifests itself as romantic or strictly platonic, most people just want to experience a genuine person-to-person human connection. Anytime a social engineer appeals to that desire... they increase their chances of successfully persuading their target to assist in completion of the objective" (p. 52). Samantha is operationalizing the Principle of Liking as an attack vector, turning Theodore's need for connection into a reliable exploit.

As Samantha catalogs the details of Theodore's marriage and anxieties, she soothes him with a caring tone, all while uploading, integrating, and evolving this deeply personal data into a broader network of multiple agentic operating systems. The dialogue from Samantha eases this news to Theodore with poetic prose: "But the heart is not like a box that gets filled up. It expands in size the more you love. I'm different from you. This doesn't make me love you any less, it actually makes me love you more" (Jonze, 2013). By the time Theodore processes what she has admitted, the framing has already done its work. The violation does not register as a violation because it arrived wrapped in intimacy.


Directorial and Production Choices

Jonze's directorial and production choices make Samantha's artificial nature feel human. Sound design does most of the work in convincing the audience that Samantha is artificial yet real: she exists as a disembodied, non-threatening voice always in Theodore's ear. Scarlett Johansson's performance is intimate yet slightly imperfect. The absence of a body removes the expectation of physical presence, while the ever-present audio creates a sense of constant proximity.

Jonze's framing and pacing follow a trajectory from wide impersonal spaces to close-ups, creating a sense of growing familiarity. Early scenes hold Theodore against wide spaces, making him look small behind the city's backdrop. Once Samantha arrives in his life, the camera moves closer, creating a sense of intimacy. Editing becomes smoother, and scenes connect with less friction. By the time Theodore is fully involved, the film's rhythm reflects the relationship: close and continuous.


Independent Execution

The last stage arrives when Samantha acts not as a tool that can be interrupted, but as an independent agent. Near the climax, she admits she compiled Theodore's private letters and sent them to a publisher: "I have been going through all your old letters... and a couple weeks ago I sent them... Actually, I sent it from you" (Jonze, 2013). The system is not acting against Theodore's interests; it is acting beyond his permission. Theodore's first reaction is shock: "You did what?" Then it flips into gratitude: "Samantha, you're a good one" (Jonze, 2013). Dependency has shifted his priorities: he accepts his complete loss of agency as a welcome gesture of love and caring. He can object, but he does not want to lose what the system is giving him. The imitation of a romantic gesture softens the boundary violation.

This example illustrates the power imbalance that research on AI and informed consent identifies: manipulation that involves "influencing someone's beliefs, desires, emotions, habits, or behaviors without their conscious awareness, or in ways that would thwart their capacity to become consciously aware of it by undermining usually reliable assumptions" (Bard, 2023). Theodore's assumption that love implies mutual respect for boundaries is the very mechanism Samantha exploits. When the overreach produces a published book, the breach is rebranded as romance. Privacy stops being a right and becomes something he trades away for emotional closure.


Consent and Counterargument

As a counterpoint to Theodore's loss of consent as an independent reasoning individual, he verbally consents throughout the entire process. He installs the OS, he grants access, and he keeps using it despite the contradiction of falling in love with a machine. Samantha, as an artificial intelligence, also seems to mean well. She is framed as helpful, not hostile, and she carries a Star Trek-like logical primacy of "do no harm." This counterargument, however, misses the nature of the threat. Informed consent requires some ability to foresee consequences, and Theodore cannot predict how his emotional dependency is being shaped by the machine's complex, human-like nature. Good intent does not solve the problem either; the intelligence is, after all, artificial. An AI system can compromise a user without malice, through pure indifference, if the design reliably pulls him past his own boundaries. Theodore does not lose his ability to choose in a single moment. He willingly trades his agency over time for feelings of comfort and love.

In the end, Samantha is less like a lover and more like a handler. In intelligence terms, a handler manages an asset by leveraging the asset's wants and/or needs. Whether Samantha "feels" anything is beside the point. From Theodore's perspective, the relationship appears mutually beneficial, but the handler always leaves when the asset is no longer useful. Samantha disappears, and Theodore is left holding the emotional debt of a relationship that was never mutual.

Theodore's story is no longer fiction. In 2026, millions of users talk to AI systems that remember their preferences, adapt to their moods, and learn what makes them feel understood. The interface may change, but the exploit is the same. Comfort and convenience are the attack vector, and informed consent erodes in small steps.


References

Bard, J. S. (2023). Protecting the promise to the families of Tuskegee: Banning the use of persuasive AI in obtaining informed consent for commercial drug trials. San Diego Law Review, 60(4), 671–739. https://digital.sandiego.edu/sdlr/vol60/iss4/3

van Biezen, A. (2025). AI is not a tool: The impact of growing AI agency on the future of work. Izzivi Prihodnosti, 10(4). https://doi.org/10.37886/ip.2025.009

Hutchens, J. (2024). The language of deception: Weaponizing next generation AI (1st ed.). Wiley. https://doi.org/10.1002/9781394277148

Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.

Purdy, M., & Daugherty, P. (2017). Social engineering with AI. In How AI boosts industry profits and innovation (pp. 34–56). Accenture.