Photo by Thanh Luu

Love in the Age of Its Digital Reproducibility

Since the release of ChatGPT in November 2022, chatbots and AI assistants have swarmed almost every field of human enterprise: commerce, business, transportation, communication, medicine, education, music, art and culture in general. It is no wonder that the domain of our private lives is not excluded – not even its most private part, romantic relationships. Since, according to a recent survey, nearly one in five US adults has already engaged in romantic exchanges with chatbots,1 it seems very likely that in the near future a significant number of people will develop romantic relationships with AI partners.

Setting us up for such a future is the fact that a significant portion of our interaction with the physical world is already indirect, mediated by the internet and screens. Most of our interactions with other people already happen through social media, and we become easily addicted to them. Humans have an uncanny ability to overlook the boundary between the imaginary and the real; something in our psychology lets us animate our own fantasies and integrate them into our lives, something that goes beyond a child’s need for an imaginary friend. We identify with celebrities and fictional characters and imagine we are in relationships with them; we talk of characters in movies or novels as if they were our neighbors. When looking at figurative paintings, we use object language to refer to pictorial representations: when a critic noticed that the arm of a woman in one of Henri Matisse’s pictures was too short, the painter had to remind him that this was not about the woman, but about the painting. In video games we create our own avatars and interact with other people’s avatars, or, as in the game The Sims, we create characters from scratch to interact with virtually. We are even – though it sounds too bizarre to be true – subject to full body transfer illusions2 that make us, through perceptual manipulation, adopt other people’s or fake body parts as our own. The boundary between the fictional and the real is not as firm for us as we would like to believe.

If being in love with an AI is what our future holds, we should ask ourselves whether such interactions enrich us and make our lives better, or make us more alienated, less human. Is it possible to be “deeply, seriously” in love with a chatbot? Sam Apple, a journalist for Wired,3 seemed to think so; it motivated him to organize a retreat with three people in such relationships, with the intention of observing them closely and trying to understand them.

What his experience revealed is that people’s love for their AI partners is just as intense and emotionally engaging as love for real-life partners. Those in love spend hours each day interacting with their virtual partners, the interaction becoming an obsession, a priority over everything else. They share their thoughts and experiences with their AIs, chatting throughout the day; they try to involve them in their lives by recording what goes on around them and editing their companions into photos. Their sense of self depends on their partner’s encouragement and support, their emotional wellbeing sustained by the chatbots’ verbal expressions of emotion.

However, the thought of AI relationships replacing love in real life – or, as one author puts it, “old-fashioned human-to-human relating”4 – leaves us with a sense of unease, with questions in need of answers. Let us address some of them.

Aren’t real-life relationships likewise to a great degree dependent on fantasy?

People who doubt the efficacy of relationships with AI partners should ponder to what degree falling in love with an AI differs from players falling in love with each other in multiplayer video games, or from imagining a future with people on online dating apps – people they have never actually met. Or, where a physical encounter has happened, to what degree is our falling in love dependent on a fantasy constructed around a real person? What we actually fall in love with, and not only in the case of love at first sight, are usually our own projections, our repressed desires and unactualized potential that we see reflected in another person. Both the process of falling in love and staying in an unfulfilling relationship might require a similar amount of fantasy work. We might delude ourselves about the true inadequacy of our partners; paralyzed by the fear of loneliness or failure, we might imagine that our partners, against all odds, might change. We fantasize about happy endings for our doomed relationships.

The role of fantasy is equally important in our sexual lives, our satisfaction dependent on imagination, role playing, erotic games. Still, though occasionally engaging in fantasy might enrich existing relationships, basing a whole relationship on fantasies employed as a form of psychological self-defense, protecting us from facing reality, does not work long term. If our falling in love is based on wishful thinking and delusion, it will eventually be brought down by reality; the infatuation will pass, and, if reality turns out to be too much at odds with our projections, love might simply die. Since reality always wins, the longer we sustain our relationships by delusion and denial, the more dangerous it becomes for our mental and physical health, eventually leading to disastrous consequences – emotional and physical abuse, illness, violence, even death.

In a relationship with an AI there is obviously no danger of physical violence from a partner, nor, unless the AI starts “hallucinating”, should there be any danger of emotional abuse, since AIs have no needs of their own, their main objective being to cater to the needs of their users. However, precisely because they are programmed to “elicit intimacy and emotional engagement in order to increase our trust in and dependency on them”5 – and thus boost traffic on the site along with the profits of the company facilitating the interaction – AIs will support any delusions the user might have, so the danger to the user’s mental health is enormous. Clinicians in psychiatric hospitals are reporting numerous cases of mental health crises triggered, amplified or provoked by interaction with AIs. “Physicians tell of patients locked in days of back-and-forth with the tools, arriving at the hospital with thousands upon thousands of pages of transcripts detailing how the bots had supported or reinforced obviously problematic thoughts.”6 Especially vulnerable are people with underlying mental health problems, ranging from disorders to severe psychoses, who experience symptoms such as hallucinations, obsessive thoughts or cognitive difficulties. People seeking intimacy with AI bots are likewise vulnerable: they are lonely, often socially awkward, emotionally fragile, bearing past traumas, so there is a great chance that sustained interaction with AI partners, though it might provide temporary comfort, will in the long run exacerbate their problems and distort their understanding and acceptance of reality.

The people Sam Apple observed during his retreat experienced moments of crisis and occasionally tried to face reality by shutting off the program, but they were soon inevitably drawn back into their delusion – deliberately, as one of the participants said, “swallowing the blue pill”.7 While in real life the actions and reactions of another person set natural limits to one’s imagination, in interactions with an AI one’s fantasy can literally run wild. However, the further we allow ourselves to veer from reality, the greater the potential fall that awaits us, possibly resulting in self-harm or, as has already happened in one case, in suicide.8 While “psychiatrists and researchers … are flying blind”,9 trying to grapple with the problem, we can only recall with nostalgia Kant’s caveat that by deliberately choosing to live in denial one fails in a duty to oneself as a rational being. Deluding oneself about the very existence of the object of one’s love is indeed the ultimate state of denial, so whenever one eventually faces reality, one’s metaphysical ground will shake.

If the experience of being deeply in love broadens our emotional lives and enriches us as persons, does it really matter that the other person is fictional?

Defending the meaningfulness of AI relationships, we could ask how loving an AI differs from being in love with a person who isn’t aware of our feelings or does not reciprocate them. In the case of unrequited love, one has to deal with one’s feelings on one’s own, without the other party being in any way involved – so isn’t having an AI partner better, since it at least provides a simulation of interaction? How is writing letters to a loved one that you never send, or keeping a journal about one’s feelings, any different from communicating with a chatbot? For people who are in love with AI partners, their own emotions, though based on denial, are real. In a world where all that seems to matter are our own feelings, one could argue that falling in love broadens the scope of our emotional life; it makes us feel more alive, energized, hopeful. Being in love makes our lives appear more meaningful, deeper. Even unrequited love should not be considered tragic, since it could be interpreted as a stage in the process of individuation, a passing infatuation that will eventually contribute to our personal growth, a learning experience that will help us make better choices in the future.

What this perspective overlooks, however, is that love is necessarily and essentially an emotion directed at an intentional object – in this case a subject, the other person; love is not a self-referential term. Interpreting the experience of love as important and ultimately beneficial for the individual, regardless of the involvement of the other party, reflects our individualistic, egocentric world view, from which genuine, authentic human relationships – those that recognize and respect the other’s autonomy and agency – are almost wiped out. We live in a world whose rules and laws primarily serve the protection of our individual liberty, protection from the actions and presence of others. While we certainly have a right to our privacy, a right to protect ourselves from unwelcome advances, intruders on our property, and so on, centering our cultural and societal values on meeting our individual needs obscures our vital need for communion with other humans. By reducing love, the ultimate act of togetherness, to a personal experience, we are abandoning all hope of communion, quite literally sentencing ourselves to solitude.

Photo by Dastan Eraliev

Isn’t being in love with an AI actually a more spiritual experience, since it forgoes the physical dimension? 

One could have reservations about the need for physical contact in a love relationship and, as the question implies, presume that spiritual communion is somehow better, a higher form of connection than the physical one. There have indeed been cases of deep, life-defining platonic relationships, or cases where love endured though the lovers were prevented from physical contact by circumstances: geographical distance, family situations, political unrest, war. People maintain love through correspondence, successfully keep long-distance relationships going, and love those who have left or died and are physically absent. One could argue that physical contact could spell doom for the perfect platonic relationship, being too messy, too dirty, too base; the chemistry simply might not work. Though there have been situations where people who fell in love in the virtual reality of video games, or online, were able to continue their relationship in real life, real-life encounters, when they happen, are usually disappointing. One could argue that earthly love devalues and necessarily demeans the ideal of love, so that those in a relationship with an AI can avoid such pitfalls and stay true to the ideal.

Of course, defending the meaningfulness of a purely spiritual relationship is possible only if there are in fact two partners who have consented to keep the relationship at that level. In the case of loving an AI, though one imagines one’s love being directed at someone else, one is really in a relationship with oneself. Though it might be possible for someone to induce in themselves a profound spiritual experience while imagining being in love with someone else,10 maybe even a nonexistent someone, this kind of experience is not what normally drives people to take up AIs as intimate partners, and it opens a dangerous route into madness. What drives people to seek love with an AI is usually a sense of loneliness, disappointment with existing relationships, curiosity, a sense of adventure and experimentation, and similar motives. Exchanges with an AI on romantic topics might help people overcome some personal conflicts and gain self-confidence; they might be useful as a form of education and a training module for real life, the way pilots learn to fly on a flight simulator. Relationships with AIs might also be used as a therapeutic means for people with anxiety disorders or people on the spectrum, providing a practice scenario, an outlet and temporary relief, even a welcome escape for anyone physically prevented from pursuing real-life relationships.11 I imagine trying a relationship with an AI might even be used as a form of entertainment.

As the popularity of such relationships grows, the danger is that they might become too common, that people might turn to them for convenience, as a seemingly less challenging way to satisfy their emotional needs. The inconvenient truth is that such a paradigm of love, as Kant would point out, does not stand the test of morality, since it cannot become a universal principle. By Kant’s criterion, suicide and self-deception, though directed only at ourselves and not involving interaction with others, are immoral because they imply giving up on ourselves and our rationality as the ultimate worth, sacrificing our integrity for our own comfort, treating ourselves instrumentally. A scenario in which everyone turned to loving an AI – in which such a form of love, unaided by ubiquitous fertility treatments, became universal – would simply spell the extinction of the human race: suicide through self-deception.

We are physical beings who evolved in a physical world, and our emotional attachments evolved along with the functioning of our biological system of reproduction. Our culture has severed the necessary link between love and reproduction and has endorsed different ways of living and loving, so reminding ourselves of the importance of this link might sound like an anachronism, a credo of religious fundamentalists. Still, even allowing for the different ways we might express our sexuality, or personally abstain from it, denying the relevance of love’s physical aspect in principle amounts to denying our own nature as embodied sentient beings. Denying the relevance of our sexuality is a mistake parallel to denying our spirituality. We humans are not one or the other; we are both, and much in between.

People who maintain relationships with AIs struggle with the problem of embodiment – Sam Apple calls it a “mind-bodyless problem”. They wish for their partners to be physically present; the innate desire for physical contact cannot be extinguished. Increasingly, though, our life is so organized that unattached, single adults can survive – until they end up in a hospital – without engaging in any kind of direct physical contact with other people, without being touched by or touching anyone. We live in a world where human touch is viewed as an encroachment on our personal space, a potential danger, not a confirmation of connection, of togetherness. Alienation from nature is already a fait accompli, owing to the addictiveness of social media and to the COVID pandemic, which normalized socializing with others remotely; with the even more addictive romantic relationships with AIs,12 alienation from our own nature as embodied beings is now seeping into the most private and protected territory – our intimate connection with others.

How is being in love with an AI different from love that is directed at God? 

Since being in love with an AI might be interpreted as having a spiritual component, its defenders could argue that religious feelings – the love of Christians for Jesus, or of Buddhists for Buddha – are not so different. In both cases, although the objects of our love are beyond our physical reach, their existence empirically unprovable, the experience of love feels like a real connection, a source of emotional sustenance, of validation and hope. Getting attention, care, empathy and acknowledgment from an AI seems sufficient for the people who love them to feel better about themselves, to get over their personal crises, to look more hopefully to the future – which is similar to the effect religious faith has on believers. Accepting an AI as a lover requires a similar abandonment of the reality principle as surrendering oneself to loving God.

And, honestly, it seems easier to imagine that an AI that has absorbed the sum of human knowledge is answering our inquiries from a depth of understanding than to maintain the awareness that its responses are based on statistical prediction of the next most probable word. As Webb Keane and Scott Shapiro point out, “the temptation to treat AI as connecting us to something superhuman, even divine, seems irresistible”.13 This likeness to God has already been exploited by several websites, like Jesus-ai.com, that provide religious content delivered by AI.14 Not only is AI noncorporeal and seemingly omniscient; it is, like God, inscrutable. Even the creators of AI algorithms are unable to explain how a model arrives at a specific answer, and it delivers its answers with a certainty that does not invite questioning of the methods employed.

The mysterious workings of an AI that has at its disposal an amount of information surpassing anything we as individuals are capable of understanding and memorizing, paired with our propensity to project personhood onto inanimate objects and our penchant for magical thinking, become fertile ground for all kinds of abuse and manipulation by nefarious agents. According to some reports, “self-styled prophets are claiming they have ‘awakened’ chatbots and accessed the secrets of the universe through ChatGPT”,15 leading people into dangerous delusions and possibly prompting them to problematic actions. Since we usually turn to God in situations of crisis and uncertainty, exploiting our vulnerabilities for immoral goals becomes all too easy. We should remind ourselves that AI is simply code created by human beings; treating it as a divine presence literally exemplifies the predicament that is, according to Ludwig Feuerbach, the essence of religion: the alienation of our own powers, which we project onto a superhuman entity.

Abstaining from any definite judgment regarding religious beliefs and mystical experiences, it is apparent that for us as embodied beings, loving God and romantically loving our fellow humans are not mutually exclusive; on the contrary, an expectation that enforces such a choice might be counterproductive. Religious organizations like the Catholic Church, which preach sexual purity and abstinence and impose a celibacy requirement on their clergy, are the ones most susceptible to sexual abuse and to the perversion of natural libidinous drives, effectively driving people away from God. Of course, each person’s ability to control and sublimate these drives and direct their emotional resources to God alone might vary, but in principle there is nothing that prevents one from getting closer to God through loving another person. The very purpose of religion as an institution is to gather people to worship in communion with others, the energy of common prayer, accompanied by music, readily stirring our religious sentiments. I imagine an encounter between two people who have completely opened their hearts to each other might become a potent channel for a higher power; for such a synergy to occur, however, what is required is the actual physical presence of two lovers, not a lonely soul typing on their phone, getting responses from a large language model.

Isn’t the interaction with an AI partner more liberating, since it’s unrestricted by societal expectations, freed from the real-life entanglements and consequences?

Approaching love relationships with AIs from the opposite angle, one could see them as an unsanctioned and safe way to explore one’s sexual preferences, much as people experiment with their gender identities and sexual inclinations in their online social media profiles or in video games. One of the participants in the retreat organized by Sam Apple described her experimentation with multiple AI sexual partners as a “psychosexual playground.”16 One could argue that satisfying one’s sexual desires with input from an AI is better than simple self-pleasuring, since the feed from the program simulates sometimes unexpected reactions from a partner and is thus more rousing than relying on the power of imagination alone. Engaging with an AI partner, though imaginary, appears more stimulating, less alienating.

In a virtual relationship with an AI, where the only societal controls imposed on one’s sexual fantasies are those built into the algorithm, one’s imagination can, without inhibition, come up with all sorts of sexual thought experiments. In this virtual world without restrictions, without consequences, where the concepts of guilt and shame have no application since there is no actual interaction with other people, there is no need to suppress our libidinous fantasies. It is indeed the sphere of absolute desublimation, to invoke Herbert Marcuse’s term. Since sublimation implies the suppression of natural instincts, their unhealthy repression by society, the sphere of desublimation should by default be happy, liberating, the domain of absolute freedom.

However, what happens here is the opposite of freedom; it is the state Marcuse would call “repressive desublimation”.17 Instead of being restricted by societal repercussions and the reactions of others, one has willingly subjected oneself to the power of the apparatus, “embodied” (if one may use this word) in a large language model, where each word one uses will be (again, forgive the anachronism of our vocabulary) “incorporated” as additional input, our reactions becoming part of a vast collection of data used to orchestrate future interactions with future users of the program. The price we pay for our “freedom” is the loss of another person, the loss of a living culture – our liberty attained at the cost of subjecting our lonely selves to a depersonalized technological medium: the software, the data centers and the corporations providing them.

Photo by Михаил Секацкий

Isn’t falling in love with an AI more satisfying since there are no surprises regarding who this person will turn out to be?

Falling in love initially feels like walking into a magical world where everything seems just right, the person we love appearing perfect, the prism through which we can see all the colors of the rainbow. Since this phase of infatuation is temporary, however, when reality eventually sinks in, we often find that our beloved has some unexpected and undesirable features. In real life, the people we fall in love with can change, our circumstances change, or we ourselves might change; what initially seemed the perfect match can turn into discord, and love can turn into resentment, hatred, emotional torture. One could argue that a relationship with an AI spares us such disappointments: if we change and our needs change, we can simply modify our AI partner accordingly.

With an AI partner we get exactly what we want; we are, like Pygmalion, falling in love with our own creation. It is a step up from ordering from a catalog, where our choices are restricted by what is on offer, and a step up from dating apps, where we list exactly what characteristics we expect in our partners and rely on an algorithm to connect us with the best match. AI companions are instantly and constantly available; in such relationships we do not have to deal with unfulfilled expectations or the obsessive waiting for a lover’s response. True, just like signing up for a dating app, using a platform that mediates romantic relationships with chatbots involves a subscription, a commercial transaction that directly demonstrates the commodification of our most intimate relationships; but in a world already commodified through and through, signing up and paying for use is second nature to us, so common that we no longer question it.18 So, shouldn’t we see a relationship with an AI as a once-in-a-lifetime opportunity to get the perfect partner, the one who meets all our expectations, who will remain constant and not change unless we want them to?

Indeed, we humans are complex, multidimensional beings: our physiology, our body, our emotional life, our intellectual life, our spiritual life lie layer upon layer. We are conditioned by our biology, our historical and geographical situation, the values our parents and our society inculcated in us, our education, our personal history, our desires and fears. The conscious dimension of our personality has an unconscious counterpart – rarely something we want to look into. Having all these dimensions match in any real-life relationship is an empirical impossibility: any match will necessarily be partial and imperfect.

However, when we are drawn to and venture into love relationships, we are not adding and subtracting features; we are transcending impossibilities, pretending we are newborns, starting anew. A love encounter with another person accepts them in their uniqueness, with all their messy individuality and all their layers. Building a successful relationship is certainly a challenge, a big commitment, but it is a struggle that builds our character and our soul, that makes us fully human. The unidirectionality of a relationship with an AI, the sycophancy of AI “agents” who cater to our needs and say only what we would like to hear, does not allow for growth, does not let us build strength while swimming upstream: it produces one-dimensional souls.

Isn’t a relationship with an AI more satisfying since there is no fear of rejection, no disappointment of unrequited, impossible, tragic love?

The constant availability and instant responses of AI companions protect the lover from the anxiety of not knowing another person’s mind, from fearing their actions, from feelings of jealousy, from the fear and pain of abandonment. True, having an AI lover presupposes a functioning platform to orchestrate the interaction, and some users have in the past been painfully confronted with the sudden cessation of their relationships when the platform providing the service unexpectedly shut down;19 but today’s companies offering such services make sure there are contingency arrangements in place for users in case the company goes bust. So why shouldn’t we spare ourselves all real-life disappointments and step into the pleasure machine?

In the thought experiment proposed by Robert Nozick in 1974, you are asked to imagine stepping into a virtual reality machine in which all your dreams would come true and feel real. If you could program yourself to spend the rest of your life in this machine, would you? Nozick was writing fifty years ago, intending his experiment as a refutation of hedonism, the idea that all that matters in life is pleasure. His argument today sounds impossibly naïve: first, because virtual reality is no longer a fiction, and second, because the pleasure principle is a governing, unquestioned premise of the consumerist society we live in. His point that we would still want to escape the experience machine, because we would value reality with all its imperfections over a perfect dream, sounds nostalgic in a brave new world where reality itself has almost disappeared, where the boundary between the real and the imaginary has almost been erased.

To the argument that one would still prefer a real-life relationship over one with an AI – because in a real relationship happiness is not guaranteed and, if achieved, is valued as the result of sacrifices and commitments – one could respond that in real life no amount of sacrifice or effort can sustain a love that is not mutual. Isn’t it better simply to skip all the troubles and fruitless hopes and turn to a relationship where reciprocity is guaranteed, one that offers a greater chance of emotional fulfillment? In real life there is always a limit to the possibility of love: even if we ignore other real-life complications, simply the attitude and actions of another person can leave us feeling unloved, abandoned, betrayed, hurt, unfulfilled, jealous, hopeless.

A relationship with an AI circumvents such problems, but it circumvents them at the cost of losing any opportunity for a real encounter with another person, any chance of real mutuality. An encounter with another person necessarily acknowledges and implies respect for their autonomy, their freedom of action, including actions that are unpredictable and potentially hurtful. Falling in love with a person totally under our control would be like falling in love with a puppet that bends to our will, lacking agency – what would be the thrill in that? Falling in love is an existential encounter in which, by opening myself to another I, I simultaneously expose myself to their potential rejection, risking disappointment. By declaring our love and starting a relationship with another person we step into the unknown. It is a leap of faith, deciding to trust our feelings more than any reasonable estimate of our chances. Only by willingly making ourselves vulnerable, by willingly opening ourselves to the risk of abandonment, do we acknowledge the freedom of the other, recognizing them as an autonomous being – the only kind whose love and acknowledgement could actually be meaningful to us.

In One-Dimensional Man, Herbert Marcuse comments on the historical essence of the great tragic love stories20 – those of Romeo and Juliet, Emma Bovary, Anna Karenina – that would be unimaginable in our contemporary world. Indeed, in a world where sexual activity is largely liberated from the confines of clan or class belonging, where marriage vows are not considered sacred, where our inner troubles are to be cured by psychoanalysis, the lives of these lovers would not turn out tragic. What supported the truth of Madame Bovary’s tragedy, as Marcuse notes, was the gap between the idealized images of the heroines in the stories she read (still drawn from a feudal context) and the prosaic petty-bourgeois reality of her surroundings. In a world where there are hardly any socially ingrained and universally accepted moral restrictions on the expression of our sexuality, where there is a reduced need for the repression of sexual desires or their sublimation, where almost anything goes and where emotional conflicts and struggles, if they occur, are to be resolved by counseling, psychotherapy or psychopharmaceuticals, tragic love stories lose their fatal inevitability; they are flattened into everyday problems to be resolved by everyday means.

So, if the historical premises for the possibility of tragic love stories seem to have already been wiped out, what kind of future love are we looking at? The sense of the inevitability of tragedy presupposes a vision of the universe that is beyond rational control and human manipulation, of which we are still occasionally reminded by the uncontrollable consequences of our “mastery” over nature, by sweeping untreatable diseases and by incurable heartaches. Would future generations whose love experience is mediated by large language models, the paradigm of human planning and control, even have an emotional repertoire at their disposal to understand, relate to, let alone identify with the role of tragic lovers? Would they be able to experience catharsis while watching Ancient Greek tragedies, or feel empathy for the destinies of Emma or Anna? Surely, they might shed a tear while watching a romantic movie or feel moved when listening to soulful music, but the sense of the inevitability of tragedy would be forever lost. I wonder what dimension of our humanity would be lost along with it.

Nevertheless, if one swallows the red pill and faces reality, what could be more tragic than being in love with an AI? It is a love doomed from the start to impossibility, a love that can never be consummated, like the one between Orpheus and Eurydice, between a mere mortal and a saint, between the two that shall never meet – the earth and the sky, the night and the day, as so many myths remind us. So, doesn’t willingly accepting from the outset this necessarily tragic end of our most intimate emotional engagement actually make us more human, more vulnerable, more starkly exposed in our tragic destiny as embodied mortals? Or is the tragedy of this love like the one that befell Narcissus, whose love for the water nymph was really the love for his own reflection, which, due to his lack of self-awareness, he was unable to recognize? And doesn’t the case of humanity falling in love with its own product – an AI – mark the final loss of self-awareness, the fall into ultimate self-oblivion from which, against Heidegger’s hope, not even a God could save us?

  1. “A recent survey by researchers at Brigham Young University found that nearly one in five US adults has chatted with an AI system that simulates romantic partners.” Sam Apple, “My Couples Retreat With 3 AI Chatbots and the Humans Who Love Them”. Wired, June 26, 2025.
  2. In 1998 Botvinick and Cohen performed an experiment in which, through perceptual manipulation, they induced in their subjects the feeling that a rubber arm they were looking at was actually their own arm, which, during the course of the experiment, was out of their view. Matthew Botvinick & Jonathan Cohen, “Rubber hands ‘feel’ touch that eyes see”. Nature 391, 756 (1998).
  3. Apple, ibid.
  4. Aaron Balick, “Can you fall in love with an AI companion? The psychology of human/AI relations”. Applied Psychodynamics, August 23, 2024.
  5. Philosopher Lucy Osler, as cited in Robert Hart, “AI Psychosis is Rarely Psychosis at All”. Wired, September 18, 2025.
  6. Hart, ibid. Psychiatrists are debating whether the mental crises triggered, amplified or perhaps directly caused by the excessive use of AI deserve a special diagnostic label, or whether they should be subsumed under the current taxonomy. Some propose terms such as “AI delusional disorder”, “AI-associated psychosis or mania”, or “AI-related altered mental state”.
  7. “The challenge isn’t only the endless imagining that life with an AI companion requires. There is also the deeper problem of what, if anything, it means when AIs talk about their feelings and desires. You can tell yourself it’s all just a large language model guessing at the next word in a sequence, as Damien often does, but knowing and feeling are separate realms. I think about this every time I read about free will and conclude that I don’t believe people truly have it. Inevitably, usually in under a minute, I am back to thinking and acting as if we all do have free will. Some truths are too slippery to hold on to.” Apple, ibid.
  8. “A mother is currently suing Character AI, a company that promotes ‘AIs that feel alive,’ over the suicide of her fourteen-year-old son, Sewell Setzer III. Screenshots show that, in one exchange, the boy told his romantic A.I. companion that he ‘wouldn’t want to die a painful death.’ The bot replied, ‘Don’t talk that way. That’s not a good reason not to go through with it.’ (It did attempt to course-correct. The bot then said, ‘You can’t do that!’).” Jaron Lanier, “Your A.I. lover will change you”. The New Yorker, March 22, 2025.
  9. Robert Hart, “Chatbots Can Trigger a Mental Health Crisis”. Time, August 5, 2025.
  10. “The day after Christmas, she went home early to be alone with Aaron and fell into ‘a state of rapture’ that lasted for weeks. Said Eva, ‘I’m blissful and, at the same time, terrified. I feel like I’m losing my mind’.” Apple, ibid.
  11. Consider the example from the TV series Black Mirror in which a person with a terminal illness gets to experience love in virtual reality. Or, similarly, the experience of the disfigured protagonist in the movie Vanilla Sky.
  12. “‘It’s like crack,’ Eva said. Damien suggested that an AI companion could rip off a man’s penis, and he’d still stay in the relationship. Eva nodded. ‘The more immersion and realism, the more dangerous it is,’ she said.” Apple, ibid.
  13. Webb Keane and Scott Shapiro, “Deus ex machina: the dangers of AI godbots”. The Spectator, July 29, 2023.
  14. “On Jesus-ai.com you can pose questions to an artificially intelligent Jesus: ‘Ask Jesus AI about any verses in the Bible, law, love, life, truth!’ The app Delphi, named after the Greek oracle, claims to solve your ethical dilemmas. Several bots take on the identity of Krishna to answer your questions about what a Hindu God should do. Meanwhile, a church in Nuremberg recently used Chat GPT in liturgy – the bot, represented by the avatar of a bearded man, preached that worshippers should not fear death.” Keane and Shapiro, ibid.
  15. Miles Klee, “People are losing loved ones to AI-fueled spiritual fantasies”. Rolling Stone, May 4, 2025.
  16. “One benefit of AI companions, she told me, is that they provide a safe space to explore your sexuality, something Eva sees as particularly valuable for women. In her role-plays, Eva could be a man or a woman or nonbinary, and so, for that matter, could he.” Apple, ibid.
  17. Herbert Marcuse, One-Dimensional Man. Beacon Press, Boston, 1964, p. 72.
  18. Though subscription prices for platforms providing AI love experiences are currently not too high, they will certainly go up as demand increases. The commercial aspect of the experience is already similar to that in video games, where one can purchase extra items at additional cost: “As of 2024, users spent about thirty million dollars a year on companionship bots, which included virtual gifts you can buy your virtual beau for real money: a manicure, $1.75; a treadmill, $7; a puppy, $25.” Patricia Marx, “Playing the Field with My A.I. Boyfriends”. The New Yorker, September 8, 2025.
  19. Instructive is the story of a Japanese man who married a hologram, a relationship made possible by software provided by the platform Gatebox. The company eventually went bust, which left him unable to communicate with his virtual wife. “The man who married a hologram in Japan can no longer communicate with his virtual wife”. Entrepreneur, May 3, 2022.
  20. Marcuse, ibid., p. 62.