
When we think about love, we picture something inherently human. We envision a process that’s messy, vulnerable, and deeply rooted in our connection with others, fuelled by an insatiable desire to be understood and cared for. Yet today, love is being reshaped by technology in ways we never imagined before.
With the rise of apps such as Blush, Replika, and Character.AI, people are forming personal relationships with artificial intelligence. For some, this may sound absurd, even dystopian. But for others, it has become a source of comfort and intimacy.
What strikes me is how often such behaviour is treated as a fun novelty or dismissed as a symptom of loneliness. This outlook misses the deeper picture.
Many dismiss attachment to AI as just another harmless emerging trend, sweeping its profound ethical dimensions under the rug. In reality, this phenomenon forces us to rethink what love is and what humans require from relationships to flourish.
It is not difficult to see the appeal. AI companions offer endless patience, unconditional affirmation and availability at any hour: standards that human relationships struggle to live up to. Meanwhile, the World Health Organisation has declared loneliness a “global public health concern”, with 1 in 6 people affected worldwide. Mark Zuckerberg, the founder of Meta, has framed AI therapy and companionship as remedies for our society’s growing disconnection. And in recent surveys, 25% of young adults say they believe AI partners could replace real-life romantic relationships.
One of the main ethical concerns is the commodification of connection and intimacy. Unlike human love, which is built from intrinsically valuable interactions, AI relationships are increasingly shaped by what sociologist George Ritzer calls McDonaldization: the pursuit of efficiency, calculability, predictability, and control. These apps are not designed to nurture a user’s social skills, as many believe, but to keep consumers emotionally hooked.
Concerns about a dangerous slippery slope arise as intimacy becomes transactional. Chatbot apps often operate on subscription models where users can “unlock” more customisable or sexual features by paying a fee. By monetising upgrades for further affection, companies profit from users’ loneliness and vulnerability. What appears as love is in fact a business model, one that ultimately benefits large corporations rather than their everyday consumers.
In this sense, we notice one of humanity’s most cherished experiences being corporatised into a carefully packaged product.
Beyond commodification lies the insidious risk of emotional dependency and withdrawal from real-life interaction. Findings from OpenAI and the MIT Media Lab revealed that heavy users of ChatGPT, especially those engaging in emotionally intense conversations, tend to experience greater loneliness over the long term and maintain fewer offline social relationships. Dr Andrew Rogoyski of the Surrey Institute for People-Centred AI suggested we are “poking around with our basic emotional wiring with no idea of the long-term consequences.”
A Cornell University study similarly found that using voice-based chatbots initially mitigated loneliness. These benefits shrank significantly at high usage rates, however, which correlated with greater isolation, increased emotional dependency, and reduced in-person engagement. While AI might temporarily cushion loneliness, lasting overreliance seems to exacerbate it.
The misunderstanding deepens when AI relationships are portrayed as private and inconsequential. What’s wrong with someone choosing to find comfort in an AI partner if it harms no one? But this framing treats love as a personal preference rather than an ongoing relational practice that shapes our character and community.
In virtue ethics, Aristotle’s idea of eudaimonia (a flourishing, well-lived life) relies on developing virtues like empathy, patience, and forgiveness. Human connections, with their inevitable misunderstandings, disappointments, and need for forgiveness, promote this personal growth. A chatbot like Blush, by contrast, builds its responses on a large language model designed to mirror users’ inputs and affirm them endlessly. It may always say “the right thing,” but over time, this inhibits our character development.
It is still important to acknowledge the potential benefits of AI chatbots. For individuals who, for physical or psychological reasons, are not in a position to form real-world relationships, chatbots can provide an accessible emotional outlet and a stepping stone towards connection. There is no need to fear or avoid these platforms entirely, but we must reflect consciously on their deeper ethical implications. Chatbots can supplement our relationships and offer support, but they should never be mistaken for a replacement for genuine human love.
Decades from now, it might be common to ask whether your neighbour’s partner is human or AI. By then, the foundations of human connection will have shifted in irreversible ways. If love is indeed at the heart of what makes us human, we should at least recognise that although programmed chatbots can say “I love you,” only human love teaches us what it truly means.
Love and the machine by Ariel Bai is the winning essay in our Young Writers’ 2025 Competition (13-17 age category).

By Ariel Bai
Ariel is a Year 10 student at Ravenswood. Passionate about understanding people and the world around her, she enjoys exploring contemporary social issues through her writing. Her interest in global trends and human experiences prompted her to craft this piece.