
The ‘good ones’ aren’t always kind


I’m sitting on a low brick wall at a party next to my date. Twenty-something boys and girls mill around, drink in hand, most of them in couples. One man clocks the boy sitting next to me, approaching us with a wide grin: “he’s a good one”.

The man is talking to me. Minutes later a girl rushes over, taking my hand in hers with a squeeze: “he’s one of the nice ones”. It would take another four months of dating before I realise that being a ‘good guy’ is very different from being a kind one.  

The terms ‘good guy’ or ‘nice guy’ have been in my consciousness for two decades: a blanket seal of approval given to people (typically men) who display surface-level qualities of respect, decency and likeability.  

In The Will to Change: Men, Masculinity, and Love, bell hooks characterises this as a mask. The ‘good guy’ mould can disguise participation in oppressive patriarchal systems. One of the largest ethical implications of the term is that it frames men as a binary: they are either a ‘good guy’ or a ‘bad guy’. This creates cognitive dissonance when a ‘good guy’ is complicit in the exact structures he claims to reject. The result is a lack of accountability, and a sense of confusion and of being attacked when these men are presented with information that misaligns with their internalised and reaffirmed sense of self.

I always wondered, why is simply being ‘good’ heralded as praise for men? As if the expectation is that they are bad, and when they surprise us with respect they jump to a pedestal as “one of the good ones”.

In October 2024, Graham Norton was joined by Saoirse Ronan, Paul Mescal, Eddie Redmayne and Denzel Washington on his panel talk show. When discussing the concept of using a phone as a self-defence tool, Paul Mescal quipped, “Who’s actually going to think about that? If someone attacked me, I’m not going to go – phone.” Mescal humorously reached into his back pocket as the audience burst into laughter. The men added various comments until Saoirse Ronan cut through their voices, “That’s what girls have to think about all the time. Am I right ladies?” The audience quickly changed tone, cheering her for speaking up whilst the men nodded quietly.  

This twenty-second exchange went viral. Publications from Vogue to The Guardian and the BBC praised Ronan’s truthful outspokenness. Many, however, drew attention to the men on the show, in particular Paul Mescal. Often characterised as a ‘soft boi’, Mescal is known for his sensitivity, emotional depth and embrace of feminine traits. He later acknowledged that Saoirse Ronan was “spot on” to call out women’s safety. But the moment served as an important reminder that the societally termed ‘good bloke’ is not exempt from bad moments.

Australian philosopher Kate Manne shows us the worst consequences of the ‘golden boy’ trope. In Down Girl: The Logic of Misogyny, she introduces the term ‘himpathy’ to describe the excessive sympathy extended to male perpetrators of sexual violence. She describes the reluctance to believe women who testify against established ‘golden boys’, citing the 2015 People v Turner case as her primary case study. In 2015, Chanel Miller (then known as Emily Doe) accused Stanford freshman Brock Turner of five counts of felony sexual assault. In that case, testimony from a female friend that Turner was “caring, sweet and respectful to her” corroborated the judge’s assessment of Turner’s character.

Manne reveals himpathy’s dangerous ethical implication: “Good guys aren’t rapists. Brock Turner is a good guy. Therefore, Brock Turner is not a rapist”. The case culminated in a six-month jail sentence and three years of probation; however, Turner was released after three months for good behaviour.

In the manosphere, ‘Nice Guy Syndrome’ has also been used to describe people who are nice with the aim of obtaining or maintaining a sexual relationship with another person. Here, being ‘good’ is currency for an ulterior agenda: the person exhibiting ‘nice guy’ qualities builds a sense of entitlement to a romantic or sexual relationship. When the other person rejects them, the ‘nice guy’ can become disdainful or irrationally angry because they were not given what they are ‘owed’. The ‘good guy’ mould and ‘nice guy syndrome’ are inextricably linked, and both rely on equating being good with being kind, when they are sometimes two very different things.

When engaging with an average well-intentioned man, the ethical implications are often more nuanced. Dr Glenn R. Schiraldi identifies childhood adversity, including neglect, abandonment or abuse, as a root cause of the insecurity that leads to passivity and over-dependence on others, particularly women, for approval. This can create the ‘good guy’ who would rather maintain a likeable façade than engage in conflict.

I’ve often sat with friends after hearing stories where a ‘good guy’ didn’t have the emotional maturity to initiate a hard conversation for fear of appearing unlikeable. And we always came back to the same questions. Having good intentions should not be disincentivised, but where does being good fall short, and being kind succeed? What does it mean to be kind?

The first time the difference between being good and being kind was articulated to me was on The Imperfects Podcast, where psychologist Dr Emily Musgrove framed it as choosing truth versus harmony. When we want to do the ‘good’ thing, we choose the option that will keep the relationship in harmony. In the long term, though, we don’t achieve harmony by continually sacrificing hard truths for its sake. Sometimes, delivering a hard truth is kinder than maintaining short-term harmony.

I was in my early twenties when I learnt that being kind meant you might have to let someone down. I was in my mid-twenties when I realised that a man being ‘good’ to me didn’t mean he was being ‘kind’ to me. This principle applies to everyone, but it prevails amongst men who care more about having a ‘good guy’ reputation than leading with integrity.

The fizziness of my cider travels straight to my brain as my legs dangle over the concrete pavement. I giggle, laugh and tipsily dance until the early hours of the morning, meeting his friends for the first time. What no one had told me was how he would keep important secrets from me for fear of hurting my feelings, which would only hurt me more. How he would withdraw when he wasn’t happy with me and how I would respond in frustration, confused and demanding answers. How he would carry antiquated views that would never come to full light because after all, he was a good guy.  

We need to eliminate the ‘good guy’ trope as a seal of approval. We need to end the binary that people are either good or bad and start operating on the foundation that everyone is a person with the potential to be good and bad in moments. Instead of being ‘nice’, we should strive to be authentic, truthful and kind, even in the moments where it doesn’t make us look good.  

 

The ‘good ones’ aren’t always kind by Isha Desai is the winning essay in our Young Writers’ 2025 Competition (18-30 age category).



Love and the machine

When we think about love, we picture something inherently human. We envision a process that’s messy, vulnerable, and deeply rooted in our connection with others, fuelled by an insatiable desire to be understood and cared for. Yet today, love is being reshaped by technology in ways we never imagined before.

With the rise of apps such as Blush, Replika, and Character.AI, people are forming personal relationships with artificial intelligence. For some, this may sound absurd, even dystopian. But for others, it has become a source of comfort and intimacy.  

What strikes me is how such behaviour is often treated as a fun novelty or dismissed as a symptom of loneliness, but this outlook can miss the deeper picture.  

Many may misunderstand forming attachments with AI as another harmless, emerging trend, sweeping its profound ethical dimensions under the rug. In reality, this phenomenon forces us to rethink what love is and what humans require from relationships to flourish.  

It is not difficult to see the appeal. AI companions offer endless patience, unconditional affirmation and availability at any hour: a consistency human relationships struggle to live up to. Meanwhile, the World Health Organisation has declared loneliness a “global public health concern”, with 1 in 6 people affected worldwide. Mark Zuckerberg, the founder of Meta, has framed AI therapy and companionship as remedies for our society’s growing modern disconnection. And in recent surveys, 25% of young adults say they believe AI partners could eventually replace real-life romantic relationships.

One of the main ethical concerns is the commodification of connection and intimacy. Unlike human love, built from intrinsically valuable interactions, AI relationships are increasingly shaped by what sociologist George Ritzer calls McDonaldization: the pursuit of calculability, predictability, control, and efficiency. These apps are not designed to nurture a user’s social skills as many believe, but to keep consumers emotionally hooked.  

Concerns of a dangerous slippery slope arise as intimacy becomes transactional. Chatbot apps often operate on subscription models where users can “unlock” more customisable or sexual features by paying a fee. By monetising upgrades for further affection, companies profit from users’ loneliness and vulnerability. What appears as love is in fact a business scheme that brings profit, ultimately benefiting large corporations instead of their everyday consumers. 

In this sense, we notice one of humanity’s most cherished experiences being corporatised into a carefully packaged product.  

Beyond commodification lies the insidious risk of emotional dependency and withdrawal from real-life interactions. Findings from OpenAI and the MIT Media Lab revealed that heavy users of ChatGPT, especially those engaging in emotionally intense conversations, tend to experience increased loneliness long-term and fewer offline social relationships. Dr Andrew Rogoyski of the Surrey Institute for People-Centred AI suggested we are “poking around with our basic emotional wiring with no idea of the long-term consequences.” 

A Cornell University study also found that usage of voice-based chatbots initially mitigated loneliness. However, these benefits were reduced significantly with high usage rates, which correlated with higher isolation, increased emotional dependency, and reduced in-person engagement. While AI might temporarily cushion feelings of seclusion, a lasting overreliance seems to exacerbate it.  

The misunderstanding deepens when AI relationships are portrayed as private and inconsequential. What’s wrong with someone choosing to find comfort in an AI partner if it harms no one? But this framing reduces love to a personal preference, rather than a set of ongoing relational interactions that shape our character and community.

If we refer to the principles of virtue ethics, Aristotle’s idea of eudaimonia (a flourishing, well-lived life) relies on developing virtues like empathy, patience, and forgiveness. Human connections promote personal growth through their inevitable misunderstandings, disappointments, and the need to forgive. A chatbot like Blush builds its responses on a large language model designed to mirror inputs and endlessly affirm them. It may always say “the right thing,” but over time, this inhibits our character development.

It is still undeniably important to acknowledge the potential benefits of AI chatbots. For individuals who, due to physical or psychological reasons, are not in a position to form real world relationships, chatbots can provide an accessible stepping-stone to an emotional outlet. There’s no need to fear or avoid these platforms entirely, but we must reflect consciously upon their deeper ethical implications. Chatbots can supplement our relationships and offer support, but they should never be misunderstood as a replacement for genuine human love.  

Decades from now, it might be common to ask whether your neighbour’s partner is human or AI. By then, the foundations of human connection will have shifted in irreversible ways. If love is indeed at the heart of what makes us human, we should at least realise that although programmed chatbots can say “I love you,” only human love teaches us what it truly means.

 

Love and the machine by Ariel Bai is the winning essay in our Young Writers’ 2025 Competition (13-17 age category).
