Thought experiment: The original position

Explainer | Society + Culture | Politics + Human Rights
BY The Ethics Centre 20 MAR 2025
If you were tasked with remaking society from scratch, how would you decide on the rules that should govern it?
This is the starting point of an influential thought experiment posed by the 20th-century American political philosopher John Rawls, intended to help us think about what a just society would look like.
Imagine you are at the very first gathering of people looking to create a society together. Rawls called this hypothetical gathering the “original position”. However, you also sit behind what Rawls called a “veil of ignorance”, so you have no idea who you will be in your society. This means you don’t know whether you will be rich, poor, male, female, able, disabled, religious, atheist, or even what your own idea of a good life looks like.
Rawls argued that these people in the original position, sitting behind a veil of ignorance, would be able to come up with a fair set of rules to run society because they would be truly impartial. And even though it’s impossible for anyone to actually “forget” who they are, the thought experiment has proven to be a highly influential tool for thinking about what a truly fair society might entail.
Social contract
Rawls’ thought experiment harkens back to a long philosophical tradition of thinking about how society – and the rules that govern it – emerged, and how they might be ethically justified. Centuries ago, thinkers like Thomas Hobbes, Jean-Jacques Rousseau and John Locke speculated that in the deep past, there were no societies as we understand them today. Instead, people lived in an anarchic “state of nature,” where each individual was governed only by their self-interest.
However, as people came together to cooperate for mutual benefit, they also got into destructive conflicts as their interests inevitably clashed. Hobbes painted a bleak picture of the state of nature as a “war of all against all” that persisted until people agreed to enter into a kind of “social contract,” where each person gives up some of their freedoms – such as the freedom to harm others – as long as everyone else in the contract does the same.
Hobbes argued that this would involve everyone outsourcing the rules of society to a monarch with absolute power – an idea that many more modern thinkers found to be unacceptably authoritarian. Locke, on the other hand, saw the social contract as a way to decide if a government had legitimacy. He argued that a government only has legitimacy if the people it governs could hypothetically come together to agree on how it’s run. This helped establish the basis of modern liberal democracy.
Rawls wanted to take the idea of a social contract further. He asked what kinds of rules people might come up with if they sat down in the original position with their peers and decided on them together.
Two principles
Rawls argued that two principles would emerge from the original position. The first principle is that each person in the society would have an equal right to the most expansive system of basic freedoms that are compatible with similar freedoms for everyone else. He believed these included things like political freedom, freedom of speech and assembly, freedom of thought, the right to own property and freedom from arbitrary arrest.
The second principle, which he called the “difference principle”, referred to how power and wealth should be distributed in the society. He argued that everyone should have equal opportunity to hold positions of authority and power, and that wealth should be distributed in a way that benefits the least advantaged members of society.
This means there can be inequality, and some people can be vastly more wealthy than others, but only if that inequality benefits those with the least wealth and power. So a world where everyone has $100 is not as just as a world where some people have $10,000, but the poorest have at least $101.
Since Rawls published his idea of the original position in A Theory of Justice in 1971, it has sparked tremendous discussion and debate among philosophers and political theorists, and helped inform how we think about liberal society. To this day, Rawls’ idea of the original position is a useful tool for thinking about what kinds of rules ought to govern society.


BY The Ethics Centre
The Ethics Centre is a not-for-profit organisation developing innovative programs, services and experiences, designed to bring ethics to the centre of professional and personal life.
Ask an ethicist: Am I falling behind in life “milestones”?

Opinion + Analysis | Society + Culture
BY Cris Parker 11 MAR 2025
Over the past couple of years I’ve noticed all my friends are either getting married, buying houses or starting families. I haven’t achieved any of these major life “milestones” yet and am worried I won’t any time soon. Should I be concerned I’m not keeping up?
Big life milestones – getting married, having kids, or buying houses – are often seen as markers of success, but there is no right way to go about them, if at all. These are deeply personal choices influenced by individual values, principles, goals, and the society we live in.
In Western societies, there is often an unspoken expectation that by your mid-30s, you should have ticked off a checklist: education, marriage, kids, and a mortgage. But where do these expectations come from? And do they really reflect what we want for ourselves?
It’s easy to feel like you are falling behind when everyone around you seems to be hitting milestones at a cracking pace. Social media does not help – our feeds are filled with engagement announcements, baby photos, and housewarming parties, making it seem like these achievements are happening all the time. These “deadlines” are often arbitrary, and the pressure to meet them can come from places that are hard to identify.
Beyond social media, family, culture, and tradition also play a significant role in shaping our expectations. Parents and grandparents often see marriage, kids, and home ownership as the natural and successful progression into adulthood. Friends, too, may unintentionally add pressure by assuming you’ll follow the same path they did – moving to the suburbs, starting a family, or planning a big wedding. These pressures can subtly reinforce the idea that there is a “right” path to follow.
Traditions can be comforting, offering a sense of structure and belonging. But they can also feel restrictive if they don’t align with your personal values. The key is to recognise that while traditions may have worked in the past, they don’t have to dictate your choices today.
Adding to these external pressures, cognitive biases also shape how we interpret success and progress. Seeing these milestones so frequently online reinforces an availability bias, leading us to overestimate how common these “achievements” really are. False consensus bias can also make us assume that the choices of our friends represent a universal societal norm.
Let’s remember these milestones are not set in stone and differ from generation to generation. Societal norms are developed from the economic and social conditions of the time. A generation or two ago, settling down young was the norm. After World War II, single-income households could afford homes, birth rates were high, and religious beliefs strongly influenced family life.
But the world today is very different. Housing prices have skyrocketed, wages haven’t kept up, and societal values have shifted, which means these traditional milestones are no longer as achievable or even desirable as they once were.
The shift away from traditional milestones reflects broader changes in societal values. Within younger generations today, there is greater emphasis on personal fulfilment, career development, and experiences such as travel and education. These evolving values redefine what we think of as success and happiness. The focus on individual agency allows people to make choices that align with their personal goals and values rather than societal pressures.
The decision to have children has become increasingly complex. Many people grapple with the ethical implications of bringing children into a world facing environmental crises and geopolitical unrest. Add financial pressures and a growing lack of trust in long-term stability, and the decision to become a parent requires careful consideration and a strong alignment with personal values. It underscores the importance of being authentic and accountable for our decisions, as these choices reflect not just our desires but our hopes and concerns for the future.
At the end of the day, what really matters is whether your choices align with your own values and aspirations. Ethically, it’s important to question whether you are pursuing these milestones because they genuinely excite you, or because you feel like you ‘should’. There’s no right or wrong answer – just the one that feels right for you.
Success isn’t about checking off a list of expectations – it’s about living in a way that is aligned with what makes you happy and fulfilled. Whether that includes marriage, kids, homeownership, or something entirely different, the most important thing is that it’s your choice – not society’s.


BY Cris Parker
Cris Parker is Head of The Ethics Alliance and a Director of the Banking and Finance Oath.
Freedom of expression, the art of...

Opinion + Analysis | Society + Culture | Politics + Human Rights
BY Brook Garru Andrew 5 MAR 2025
The creative mind is a vast universe of ideas, emotions, and experiences, expressed through music, painting, and poetry, and beholden to the forces of politics, history, and memory.
Creation is never neutral; it is shaped by ethics, challenged by censorship, and navigates evolving taboos. Artists share an impulse to express, question, and challenge shifting realities. Their work requires not only careful creation but also thoughtful engagement from society.
Martinican writer and philosopher Édouard Glissant explored identity as fluid, shaped by culture, history, and human connections. He introduced concepts like creolisation – the blending and evolution of cultures – and relation, which emphasises deep interconnectedness between all things. Glissant argued that true understanding does not require full transparency; rather, opacity allows for complexity and respect. His ideas remain essential in a time when artistic expression faces increasing scrutiny.
Art that challenges dominant narratives has historically faced repression. Artists have been censored, exiled, or condemned for confronting power structures. Dmitri Shostakovich was denounced by Stalin’s Soviet regime, Fela Kuti was imprisoned for his politically charged lyrics, and Buffy Sainte-Marie was reportedly blacklisted for her activism against war and colonial oppression. These cases highlight the ongoing tension between creative expression and societal control.
The questions remain: how can we create spaces for open debate that accommodate differing perspectives without deepening divisions? Can artistic expression be challenged and discussed without escalating polarisation? These ethical considerations are central to the evolving relationship between art, power, politics, and public discourse. Glissant’s vision of the world as a relational space offers a framework for understanding these tensions. We will not always agree and accepting difference is an opportunity for growth, to find ways to work together and create the world we want to be in.
Australia, despite its colonial past which it is still coming to terms with, has the potential to embrace this openness. It has long been a refuge for those fleeing war and persecution, and its commitment to artistic independence could foster healing and truth telling. This was achieved in 2024, when Archie Moore transformed the Australian Pavilion at the Venice Biennale with kith and kin, a genealogical mural that created a space of memorial to address Aboriginal Deaths in Custody. His work compelled viewers to confront colonial violence and won the prestigious Golden Lion for Best National Participation.
Our ability to embrace openness, however, is continually tested. In February 2025, Creative Australia selected Khaled Sabsabi to represent the country at the 2026 Venice Biennale. However, his appointment was quickly rescinded due to controversy over some of his past works. Amid heightened tensions – especially following recent antisemitic and Islamophobic incidents in Australia – these early works ignited debate over the appropriateness of his selection to represent Australia.
Born in Tripoli, Lebanon, in 1965, Sabsabi migrated to Australia in 1978 to escape the Lebanese civil war. Now a leading multimedia artist, his work explores identity, conflict, and cultural representation. He does not seek provocation for its own sake but, in his words, “humanity and commonality” in a fractured world. The decision by Creative Australia and the unwillingness to consult with the artistic team will have consequences yet to be revealed.
Sabsabi’s removal was not solely about his art but also about a reluctance to engage with the complexity of his personal and artistic journey. The debate was quickly reduced to sensationalism and exploited by those seeking political gain. As of February 2025, major conflicts persist internationally, including in Myanmar, Israel-Palestine, Sudan, Ukraine, Ethiopia, Afghanistan, Pakistan, and West Papua. These crises underscore the responsibility of countries like Australia to uphold international law, human rights protection, and open dialogue. Yet many in Australia today are hesitant – or even fearful – to publicly express their views on these crises, particularly in writing.
In this global landscape, artists play a vital role in documenting realities, challenging dominant narratives, and fostering open dialogue. Yet, such actions are becoming increasingly difficult. Perspectives are scrutinised, and the line between advocacy and bias is often blurred. Artists, journalists, and commentators must navigate these complexities with integrity, acknowledging multiple truths while resisting ideological pressures.
The Venice Biennale, one of the world’s most prestigious art exhibitions, has long been a site where these tensions surface. The Giardini della Biennale, its historic heart, houses 29 national pavilions, mostly belonging to First World nations. Established in 1895, the Biennale reflects global cultural and political hierarchies. Wealthy Western countries dominate, while many nations lack permanent pavilions and must rent temporary spaces. This structural imbalance highlights broader geopolitical tensions, where artistic representation remains deeply entangled with power, privilege, and exclusion.
When an artist like Sabsabi is removed from such a space, it is not just censorship – it is a lack of engagement with the world as it truly is: complex, layered, and shaped by histories of displacement and resilience. At a time when war, genocide, colonial legacies, and cultural erasure remain pressing concerns, his exclusion is a loss not only for Australia but for the ideals of artistic dialogue and exchange the Biennale is meant to uphold.
This moment underscores the need for genuine artistic and cultural engagement in a world in distress. Art is not just about provocation; it fosters dialogue and solidarity. While opinions on Sabsabi’s work may differ, his artwork is an expression of his lived experience. His exploration of difficult truths and diverse viewpoints underscores that nothing remains static.
The role of the artist is to show us parts of the world that we may not, or refuse to, see. It does not mean that we must agree with their viewpoint.
The broader concern is what this decision reveals about Australian values, how we as individuals and institutions engage with art, and the nation’s commitment to artistic freedom. It raises questions about transparency and decision-making in cultural institutions, but also how cultural dialogue is shaped by media, journalists, and politicians to inform public opinion.
As Australia navigates this complex moment, the challenge lies in balancing artistic freedom with public accountability. The outcome of this controversy may influence future cultural policies, shaping the country’s approach to representing its artists and how art should be viewed. Whether the withdrawal is seen as necessary caution or a loss to Australian artistic identity remains an open question.
In a world marked by ideological clashes, religious wars, and present and historical trauma, there are no absolute winners, with many caught in cycles of division. Growth begins with listening and as Glissant teaches, identity and expression cannot be confined to binaries. Complexity and contradiction invite deeper understanding. A just society must make room for difficult conversations and challenging artistic expressions.
Image: Khaled Sabsabi portrait, Unseen, 2023, image courtesy the artist and Mosman Art Gallery, © Mosman Art Gallery. Photograph: Cassandra Hannagan

BY Brook Garru Andrew
Brook Garru Andrew is an artist, curator and writer who is driven by the collisions of intertwined narratives. His practice is grounded in his perspective as a Wiradjuri and Celtic person from Australia. Brook is Director of Reimagining Museums and Collections at the University of Melbourne and Research Fellow at the Pitt Rivers Museum Oxford, and is represented by Tolarno Galleries, Melbourne, Ames Yavuz Gallery, Sydney/Singapore/London, and Galerie Nathalie Obadia, Paris/Brussels.
When our possibilities seem to collapse

Opinion + Analysis | Politics + Human Rights | Society + Culture
BY Simon Longstaff 24 FEB 2025
In recent years, I have had an increasing number of conversations with people who feel overwhelmed by world events.
They describe the world as wracked by multiple crises of such magnitude that efforts to reform or repair what is broken are simply futile. This attitude is not confined to those who are poor, oppressed or marginalised; the feeling seems divorced from personal circumstances, with even the privileged experiencing the same grim outlook.
Especially troubling is the fact that this feeling has been building despite all of the evidence that Earth has become, on average, a safer, more peaceful place. A decreasing percentage of people live in abject poverty. Infant mortality rates are declining. There are fewer famines and more to eat. On the whole, the ‘four horsemen of the Apocalypse’ have been left playing cards in the stable.
Of course, specific individuals and groups still suffer the full panoply of ills that can afflict the human condition. But, overall, things have been getting better at the same time that people have come to believe that the world is a place of deepening doom and gloom.
One can easily think of explanations for the divergence between reality and perception. First is the power of individual stories. A compelling narrative of one person’s suffering can easily be taken as representative of the whole. It’s the same general phenomenon that leads us to feel better whenever we encounter a narrative in which the ‘underdog’ triumphs over a more powerful rival. These figures are swiftly ‘universalised’ to the point that any one of us can see our own possibilities reflected in those of the protagonist. So, when our stories tilt from the positive to the negative, we are likely to see our view of the world take on a pessimistic hue.
This brings us to the second reason for this negative outlook. Unfortunately, the media (in all its forms) has ‘discovered’ that ‘bad news’ sells. Or to be more precise, bad news attracts and retains an audience in a way that positive news does not. The implications of this ‘insight’ have been grossly magnified by the emergence of social media – with its insatiable drive for eyes, ears, minds, etc. One of the most depressing stories I have heard, in recent years, is of a publisher that monitors every second of a story’s online life. The moment interest begins to flag, I am reliably advised, the instruction is to ‘bait the hook’ with something that implies ‘crisis’. So, what might properly be considered ‘troubling’ is elevated to the point where it feels like an existential threat posing a risk to all. Few remember the details when (almost inevitably) the more mundane truth emerges. What remains is the feeling – and facts rarely disturb established feelings.
These factors (and a host of others – too numerous to mention here) have combined to help generate a creeping sense that one might as well disengage and let the world go to hell in the proverbial ‘handbasket’. Let’s not bring children into a failing world. Let’s limit our attention to things within our span of control (not much). Let’s not vote – it’s pointless …
The deeply disturbing thing about all of this is that the temptation to withdraw because of the perceived futility of engagement gives rise to a self-fulfilling prophecy grounded in that great enemy of democracy; despair.
The truth of democracy (and markets, for that matter) is that even the least powerful individual can change the world when working with others. Remarkably, we don’t even have to plan and coordinate for collective action to have a powerful effect. Indeed, one need not be ‘heroic’ (in the ‘Nelson Mandela’ sense) to be an author of momentous events. It’s just that you are unlikely ever to know when your decision or action tilted the world on its axis. Often it requires only that a person fall a smidgen on the right side of a question. And when enough people do the same, there is progress.
Despair is the absence of hope that this is possible. It is a deceit that makes us fail – not because we never could succeed but because it destroys our belief in the possibility that we can be effective agents for change.
If we succumb to despair, then our possibilities collapse – not because we are powerless but because we believe ourselves to be.
Australia should aspire to be one of the world’s great democracies. It’s in our bones – dating back to South Australia, which, in the late nineteenth century, brought into the world the first modern democracy in which all adults could not only vote but take up public office – irrespective of sex, gender, race or creed. For years, the secret ballot was known as the “Australian Ballot”. In 1900, Australia held more patents per head of population than any other country in the world. Yes, there were all the problems of colonisation, racism, sexism and a plethora of wrongs. And yes, we went backward for a while after Federation. But that reality did not give rise to despair.
Things actually can get better. Democracy is the vehicle for doing so; not because it demands so much of us as individuals – but because it demands so little. We just need to allow ourselves a small measure of hope to defeat despair. We need to take just a small, individual step forwards. Do this together, and nothing can stop us.


BY Simon Longstaff
After studying law in Sydney and teaching in Tasmania, Simon pursued postgraduate studies in philosophy as a Member of Magdalene College, Cambridge. In 1991, Simon commenced his work as the first Executive Director of The Ethics Centre. In 2013, he was made an officer of the Order of Australia (AO) for “distinguished service to the community through the promotion of ethical standards in governance and business, to improving corporate responsibility, and to philosophy.”
Does your body tell the truth? Apple Cider Vinegar and the warning cry of wellness

Opinion + Analysis | Health + Wellbeing | Society + Culture
BY Joseph Earp 19 FEB 2025
Of all the snake oil salespeople who have dominated the wellness space, few have been as destructively, unsettlingly committed to the bit as Belle Gibson.
For a while, Gibson made her name on her own alleged suffering. A social media influencer, author, and wellness personality, she claimed to have been diagnosed with a horrifying assortment of cancers, from tumours in her brain, to tumours on her liver.
More than that, she claimed that these cancers were responding not to mainstream medicine, but to a wholefood diet. Gibson, a young parent based in Melbourne, was essentially arguing that she had found one of the cures for cancer.
Of course, she hadn’t. When it became clear that Gibson was not passing on some of the money that she had made from her successful cookbook and app to charity, as she promised, her tangle of lies began to fall apart. In fact, Gibson had never been diagnosed with cancer, as she admitted in a trainwreck interview with 60 Minutes, half an hour of television so actively unsettling that it is still seared into the mind of many of the Australians who watched it.
But even then, even when challenged, she never quite admitted the truth. In one memorable moment of the 60 Minutes episode, restaged in the recent Netflix miniseries Apple Cider Vinegar, Gibson is asked to define what “truth” is.
Her mumbly non-answer was taken as evidence that Gibson couldn’t stop lying, even when asked to respond to even the most basic of questions. But her inability to give a clean definition of truth cuts right to the heart of the wellness space – and, hopefully, teaches us how to stay clear of its worst impulses.

The whole truth
Mainstream medicine and alternative therapies are constantly pitted against each other by the likes of Gibson and other wellness influencers, who are suspicious of the “objectivity” offered by mainstream medicine. Gibson, and other alternative medicine advocates like disgraced chef Pete Evans, point to science’s flaws – to the ways that medicine messes up. This, they say, is proof that mainstream medicine can’t answer all our problems.
But these critiques of science are not just the domain of wellness influencers – they are also common in post-modern and pragmatist philosophy.
In his book Against Method, Austrian philosopher Paul Feyerabend threw chaos into the established notion of the scientific truth. According to Feyerabend, there is no such thing as the “scientific method” – no way to make the multitude of different ways to enact science appear streamlined and “correct.”
By contrast, Feyerabend advocated for an “anything goes” principle of scientific enquiry, arguing that scientists should follow whatever research paths seem particularly interesting to them.
According to him, science is not some way to get to the heart of the matter – to understand what is definitely, objectively true. Instead, he saw science as a way of understanding the world – an art form, in the same way that painting a picture or writing a poem is a way of understanding the world.
In this way, Feyerabend echoes the work of pragmatist philosopher Richard Rorty. Rorty argued that there is no viewpoint that exists outside of context – no way to get “more objective”. We can’t ever see past who we are and where we are in time. We are constantly mired in context. A “viewpoint outside history” does not exist. Science isn’t a mirror that can be better polished, and eventually represent the world exactly as it is. The most we can hope for is some sort of established consensus, a bunch of subjective viewpoints that align with each other, rather than an objective reality.
Rorty and Feyerabend’s arguments do have some genuine pull to them. It is correct that science is always changing – that discoveries that seemed “objectively true” ten years ago get replaced and reordered. It is also correct that medicine mucks up, and that many are disenfranchised from mainstream forms of science because of the way it uses claims of objectivity like a cudgel, silencing all dissenters. As in, “this is true, and anything that goes against this is false.”
But if we agree with the likes of Feyerabend and Rorty, and do away with the mainstream “truth” provided by science, are we stuck with the likes of Belle Gibson? Does throwing away objectivity mean that we must put up with the liars, scammers, and fraudsters? Does it mean that Gibson really does have the same claim to the “truth” – a truth she could not even name?
The healing nature of balance
The short answer is, of course, no. If questioning the scientific method really did mean granting an equal claim to the truth to Gibson – who definitely, plainly, simply lied, and made a great deal of cash off those lies and the perpetuated suffering of those who believed her – then that questioning would be patently dangerous.
But, as with so many things, the answer lies in the acceptance of balance. We do not have to treat medicine as a new form of religion, with iron-clad rules that we dare not ever question. Nor do we have to completely, constantly reject all of its findings, and put up absolute falsehoods in their place.
Rorty was never advocating for the likes of Gibson. Even though he undid some of the foundations of objectivity, he was not arguing that we can do whatever we like, or believe whatever we like. One of his most important and practical ideas, as mentioned above, was the idea of consensus. Even if we take the postmodern approach that “objectivity” is a shaky concept, we can create a picture of what consensus is that gives us all the things we like about objectivity.
As in, consensus can be reached by scientists and doctors. These specialists have dedicated their lives to uncovering the best ways to heal and help our bodies. When enough of these specialists find a common way to heal and help, then we have reached consensus. Consensus has the force of objectivity – it has reasons to explain itself. It’s not just random. It’s not Gibson just making things up. It’s a way of saying, “this works, and a lot of people agree that it works.”
What separates this picture of consensus from “objectivity” and its potential harms is that consensus can change. When new discoveries are made, they can be shared among specialists, who can alter what they believe. It’s not that they were “wrong” before; it’s that consensus is malleable, changeable, and ever in flux.
Belle Gibson couldn’t define what the truth is. She used that messiness to exploit people, and to cause harm. But we can ever-so-slightly release our hold on objectivity, without becoming Gibson. And in doing so, we can embrace a modern medicine that does what it was meant to do: help people.


BY Joseph Earp
Joseph Earp is a poet, journalist and philosophy student. He is currently undertaking his PhD at the University of Sydney, studying the work of David Hume.
The right to connect
Opinion + Analysis
Health + Wellbeing, Society + Culture
BY Marlene Jo Baquiran 28 JAN 2025
Gen Zs are terrifying: we are the wizards of the digital world that is increasingly alien to the older generations.
We are hooked up by the veins to the ever flowing current of information and all of its chaos and simulacra, often seen as being emblematic of technology gone too far; escapists subsumed to a Matrix-like existence.
The other terrifying thing about us: we’re all mentally ill, apparently. So much so that Jonathan Haidt released a book about us called “The Anxious Generation”, expanding on the most popular explanation for rising mental illness in youth: the infamous ‘phone bad’ argument.
This claim has undeniable legitimacy: studies correlate social media usage with mental illness, including rising self-harm ER admissions for American teenage girls. There is no world in which a teen scrolling TikTok in their bed after a long day at school can defend their attention from a trillion-dollar organisation of world-class optimisers. Our generational psyche has been crafted into a consumer mind by a minority of technocrats.
In short, our generation has been raised into addiction. Understandably, this problem is now starting to drive policy, underpinning the blanket social media ban for youth being proposed by the Australian government.
I don’t think I would miss social media necessarily. But when I entertain what it would look like to close the portal to that void world, I do wonder: what then would I build on that empty plot?
My addict attention-span, whittled down to the duration of a good reel, attempts to ground itself firmly into the real world: OK, I’ve wasted so much time. I should focus on work — earning money. Money for what? Perpetual rent, until I can double my salary to buy a house. A house for what? My future family. But it feels unfair to have kids in this future, anticipating worsening climate disaster. Can I even manage a family when I can barely manage my own life? Or, well, this is kind of exhausting… I should relax… I’ll just quickly check Instagram to see if my friend sent something funny…
The checking is an itch. I scratch it compulsively when distracted, anxious, bored, or depressed, thereby gaining short-term relief. But as with other types of addiction, while the object of addiction makes things worse, it is not the same as the source. Psychiatrist Gabor Maté’s mantra is: “Don’t ask why the addiction, ask why the pain.”
Perhaps my train of thought seems overly catastrophic — the kind of hysteria to be expected from the ‘anxious generation’. But the content of these anxieties driving addiction has real, material legitimacy, and they are not arbitrary. A home, a community, and a safe climate: these are frequently cited themes driving Gen Z’s anxieties.
They are recurring themes because they are how we connect to the outside world. Johann Hari wrote that the opposite of addiction isn’t sobriety: it’s connection. A physical home is the cradle of our connection to a place; a safe base. A community and friends who know and care for us is our connection to people, allowing us to develop fluid and dynamic social identities. A safe climate represents the connection to any foundational future; a promise that our actions today are and will continue to be meaningful. Therefore, these basics are also our critical defences against addiction — or more precisely, they are the salve for the part of us that is drawn to the escape it offers.
When basic needs for connection have become unattainable fantasies, it is less mysterious why mental illness in youth may not be easily solved by unplugging.
Home ownership is a pipe dream rather than a mundane right in an absurdly land-rich city like Sydney. Neoliberal market logic reigns supreme over the provision of basic needs, treating houses as a class of speculative investment assets rather than functional shelter. Alan Kohler notes house prices have increased from four to eight times annual income in one generation. Nor is this peculiar to Australia: it is a global crisis, as the World Economic Forum points out.
This alienation is worsened by looming climate disasters, which can uproot entire homes – as they already do for many people in rural communities and the Global South – with their reach to the rest of the world only a matter of time. Yet while three-quarters of Australians aged 18-29 believe climate change is a pressing problem, only around half of those aged over 50 do.
Loneliness, too, is a global epidemic, according to the World Health Organisation. The average person has fewer and fewer friends, with Gen Zs feeling lonelier than other generations, including those aged 65 and over. On the flipside, young people increasingly engage in unsatisfying forms of intimacy like non-committal relationships (situationships) and AI personas such as Character.AI – forms of commoditised connection which, like social media, are symptoms of disconnection rather than its drivers, even as they make it worse.
There is a clear inequality in the opportunities to build physical, social and lasting connections between young people now and the older generations, who also hold most positions of power. Without recognising this structural unfairness, it is too easy to imagine mental illness as a linear consequence of technology – a framing that conveniently dismisses young people’s complaints as hysteria.
But our ‘hysteria’ is Janus-faced: yes, young people may be ungrounded, overstimulated and sensitive – but we are also flexible, intensely curious, and sensitive. Behind our grief is a quiet hope that needs to be understood and nurtured – never belittled. It is correct but incomplete to say that our generation is overly anxious and negative: many Gen Zs, often through social media, write letters to MPs, courageously reaching for environmental justice. When understood and armed with the right tools for connection and change, our sensitivity is a source of strength.
Fixing ‘phone bad’ with a blanket ban will not fix the root causes of mental illness, particularly for a generation with so few existing tools for coping and connection. Experts on digital media fear that a social media ban will further isolate already-isolated youth, and with social media being a nexus for youth activism, it will only serve to silence youth voices further. Practically speaking, this legislation has also failed to work in other countries. Fundamentally, the escapism of the digital world is driven by the hostility of the real world towards youth.
What the older generations owe us, then, is to form solutions that sincerely integrate the experiences of youth to treat the structural sources of illness: the lack of Gen Z’s physical security, social connection and a future. Technology should be regulated through a framework of harm reduction, recognising its role as an object of addiction.
Lastly, they should recognise that our visible vulnerability is not the sole problem but reflective of the system’s failures; in the age of climate collapse, we are canaries in a coal mine. That does not make us weak, but protectors of what is soft. And if we are no longer willing to create structures to protect the soft, then what, exactly, do we create our society for?
This is one of the Highly Commended essays in the over-18s category of our 2025 Young Writers’ Competition.


BY Marlene Jo Baquiran
Marlene Jo Baquiran is a writer and activist from Western Sydney (Dharug land), Australia. Her writing focuses on culture, politics and climate, and is also featured in the book 'On This Ground: Best Australian Nature Writing'. She has worked on various climate technologies and currently runs the grassroots group 'Climate Writers' (Instagram: @climatewriters), which won the Edna Ryan Award for Community Activism.
Big Thinker: Epicurus

Epicurus (341–270 BCE) was an ancient Greek philosopher and founder of the highly influential school of philosophy, Epicureanism.
In a time dominated by Platonism, Epicurus established a competing school in Athens known as “the Garden”. Many of his teachings were direct contradictions of the teachings of Plato, other schools of thought and generally accepted ideas in areas like theology and politics. He also flouted norms of the time by openly allowing women and slaves to join and participate in the school.
Though he strongly insisted otherwise, dubbing himself “self-taught”, records indicate Epicurus was greatly influenced by many philosophers of and before his time such as Democritus, Pyrrho and Plato.
Theology and Ethics
Two significant departures from popular ancient Greek thought involved Epicurus’ ideas about theology. In general, he was critical of popular religion, though in a much more restrained way than his later followers.
One departure was his thoughts on the afterlife. Epicurus believed that there was no afterlife, and that any belief in it, especially the idea that the afterlife could involve punishment and suffering, was a harmful superstition that prevented people from living a good life.
“Accustom thyself to believe that death is nothing to us, for good and evil imply sentience, and death is the privation of all sentience; . . . Death, therefore, the most awful of evils, is nothing to us, seeing that, when we are, death is not come, and, when death is come, we are not.”
From Letter to Menoeceus
Another departure was his thoughts on the gods, and specifically their involvement in human affairs, also known as divine providence. Unlike most, Epicurus believed in the gods while simultaneously believing that they were completely removed from the mortal realm and uninvolved in human affairs.
The Epicurean view of the gods was that they were perfect beings, and that involvement in anything outside their perfection would tarnish that perfection. On this view, the gods exhibit no control over humans or the world; instead, they function mainly as aspirational figures – beings to admire and emulate.
Both of these departures from popular ancient Greek religion came from his unique hedonistic philosophy. Epicurus believed that the ultimate purpose of philosophy was to achieve, and help others to achieve, certain states of being that characterise the eudaimonic (happy, flourishing) life.
Specifically, he thought that eudaimonia was attained through internal peace (ataraxia), an absence of pain (aponia) and a life of friendship.
This was the basis of his hedonism – unique in that he defined pleasure as an absence of suffering and so had a far greater focus on moderation than what is typically associated with hedonism.
In fact, Epicurus was disapproving of excessiveness generally:
“…the pleasant life is produced not by a string of drinking bouts and revelries, nor by the enjoyment of boys and women, nor by fish and the other items on an expensive menu, but by sober reasoning.”
These ideas informed his theological thoughts. He believed that belief in the afterlife was a harmful superstition because it was most often a source of fear. Rather than acting morally to avoid punishment in the afterlife – a fear that causes suffering in this life – Epicurus taught that we should act morally because we will inevitably suffer from guilt or the fear of being discovered if we do not.
Likewise, he taught that while the gods had no interest in the affairs of humans, we should still act morally and kindly because those who do will have no fear, leading to ataraxia.
“…it is not possible to live pleasurably without living sensibly and nobly and justly.”
Pleasure and Desire
As part of his ethical teachings, Epicurus noted different types of pleasures and desires that prevent us from achieving a life free from suffering and trouble.
He was particularly focused on desire because he saw it, like fear, as a ubiquitous source of suffering. He taught that there are three kinds of desires: two natural, and one empty. Natural desires can be necessary (like food or shelter) or unnecessary (like luxury food or recreational sex). Empty desires on the other hand correspond to no genuine, natural need and often can never be satisfied (wealth, fame, immortality), leading to continuous pain of unfulfilled desire.
Unfortunately, as with many Hellenistic philosophers, the vast majority of Epicurus’ writings have been lost; his ideas have instead been pieced together largely from the writings of contemporaries, followers and later historians. Despite this, those ideas have resurfaced throughout the centuries since and influenced the thinking of ancient and modern philosophers alike.


BY The Ethics Centre
The Ethics Centre is a not-for-profit organisation developing innovative programs, services and experiences, designed to bring ethics to the centre of professional and personal life.
David Lynch’s most surprising, important quality? His hope
Opinion + Analysis
Society + Culture
BY Joseph Earp 17 JAN 2025
David Lynch’s Blue Velvet – the film that turned an outsider auteur into something approaching a genuine cultural sensation – is, even after all these years, a hard watch.
The film sets up its thesis almost immediately. A montage of quaint images of small-town life, all blue skies and white picket fences, is disrupted by tragedy – the father of the film’s hero, Jeffrey Beaumont (Kyle MacLachlan), collapses in his yard, killed by a heart attack. The man’s dog laps at the hose clutched in his dead, tight hands. And then Lynch’s camera, still exploring, does something both beautiful and terrible: it burrows under the soil, where a nightmarish cacophony of insects forage.
So there it is, Blue Velvet’s message – that beneath Americana, with its bright smiles and cups of coffee, lies horror. From that starting point, rape, torture and abuse abound, as Jeffrey’s voyeurism sends his path crashing into the orbit of Frank Booth (Dennis Hopper, in one of cinema’s most terrifying performances, all gritted teeth and mummy issues). We watch Dorothy Vallens (Isabella Rossellini) get tortured in various ways; we see Jeffrey beaten and humiliated. And perhaps most unsettlingly of all, we come to realise that the likes of Frank, a personification of pure evil, are more plentiful than we might ever want to believe. Even, if not especially, here, where the skies are blue.

The world is wild at heart and weird on top
Much has been made of this quality in the films of Lynch, who died today after a battle with emphysema: the contrast between ethical virtue and deep ethical horror. Setting these two disparate forces – good and evil – against each other was the modus operandi of Lynch’s one-of-a-kind career.
Twin Peaks, his hit television show, unravelled the angelic exterior of murdered teenager Laura Palmer and pitted her against another of Lynch’s satanic figures, the supernatural drifter Bob. Wild at Heart, one of the more underrated films he ever made, plunged a loving young couple, Sailor and Lula, into an impossibly evil world. And The Elephant Man, a black-and-white muted howl of pain, saw a man with disabilities try to find hope amongst objectification and cruelty.

But what is not often discussed about Lynch is that he did find beauty, time and time again. Contrary to what some have written, Lynch’s veneer of smiles and blue skies wasn’t some ironic posturing, established merely to make the horror more horrifying. Other filmmakers have untangled the way that evil thrives in darkness, out of sight – that’s not what made Lynch special.
Lynch’s power – his genius, even – is that he believed fully in both of the forces that make the world what it is, the darkness and the light.
This is a rare sort of ethical dialectics: rare both in art, and in our personal lives. Believe too deeply in the evil of the world, and you will simply never get out of bed. Ignore that evil, and strive forward as though it isn’t there, and you will fall prey to an ignorance that will make you a poor ethical actor. The real trick in all things is to understand that truth, if we ever find it, exists in the middle of extremes.
Hence the model of the archetypal Lynch hero: Dale Cooper (MacLachlan, again), the handsome, profoundly odd detective hero of Twin Peaks. Dale has a goofy, almost unrepentant enthusiasm. He loves coffee; he loves pie; he loves the town of Twin Peaks. He’s all broad smiles, and dorky thumbs up, perpetually grinning to the small town residents that come to love him. But this optimism doesn’t exist in spite of the darkness of the world – it exists because of it.
That understanding is expressed through his deep affection for Laura Palmer, the dead young woman he never met. The more he learns about Laura, believed by the town to be the perfect all-American girl, the more he loves her, even as he comes to see the precise shape of her demons.

Lynch’s lesson is contained here. Cooper doesn’t choose to believe in the goodness of people at the expense of acknowledging their capacity for great harm. He understands that the world is built, in many ways, for cruelty to flourish; for abusers to thrive, for casual unkindness to go unremarked upon. And he also understands that, surprisingly, time and time again, human beings will decide to love each other.
The art life
This complicated optimism was also at the heart of Lynch’s deeply inspiring life outside of filmmaking. Like Cooper, Lynch was a famous lover of little treats, the kind of tiny slivers of goodness that aren’t just a distraction from the world – they are the world. There are countless memes of Lynch expounding the beauty of a good cup of coffee; enjoying two cookies and a Coke in the back seat of a car; and, perhaps most movingly of all, speaking lovingly of the importance of what he called “the art life.”
For Lynch, the art life was painting, thinking, and making things with your hands. “I had this idea that you drink coffee, you smoke cigarettes, and you paint, and that’s it”, Lynch said once, happily. There is horror in this world, but being an artist isn’t just an aesthetic choice – it’s an ethical one. Being an artist means being curious; looking; creating.
Rather than being swamped by the inexorable downward slide of humanity, the artistic life allows one to see the things that make us, at the end of the day, so blessed. So loved.

Blue Velvet contains that hope in its final scene. After taking a long drive through hell, the film wraps up not with an image of suffering or pain. Instead, one of its last shots is a robin sitting on a branch. A robin – that most ordinary of birds, so small as to be invisible, but a symbol, built up over the course of the film, representing love itself. “Maybe the robins are here,” Jeffrey says, cautiously. But, despite it all, they are.


BY Joseph Earp
Discomfort isn’t dangerous, but avoiding it could be
Opinion + Analysis
Society + Culture, Relationships
BY Emma Wilkins 7 JAN 2025
If I were a reviewer at a writers’ festival and I spotted an author whose work I’d praised – but also criticised – I’d be tempted to look the other way.
But, far from avoiding Christos Tsiolkas at last year’s Canberra Writers’ Festival, literary critic and Festival Artistic Director Beejay Silcox chose to share the stage with him.
When asked if she felt uncomfortable, Silcox said that because she felt she’d done her job well – reflecting on the work, dealing with it on its own terms, choosing her words carefully – she didn’t. If she couldn’t be honest about “the loveliest man in Australian literature”, she didn’t deserve her job.
She did, however, describe that job as “inherently uncomfortable”. It’s the reason people often call her “brave”. But Silcox doesn’t share their view. “If what I do counts for bravery in our culture, we are f*cked,” she told the audience. “I know what bravery looks like; I’ve seen brave people. I’m just being honest.”
Point taken. There’s a difference between being honest and being brave. Honesty might require bravery, but the words aren’t interchangeable. If we use a word like “bravery” too readily, we broaden its definition and reduce its potency.
It was the expanded application of certain words that led psychology professor Nick Haslam to coin the term “concept creep”. More than a decade ago, he started noticing that the widespread adoption of certain psychological terms in non-clinical settings was broadening people’s conceptions of harm. In a recent ABC interview, he said more expansive definitions of terms like “abuse” and “bullying” have had clear advantages, such as making it easier to call out bad behaviour. But mistakenly framing an unpleasant experience as “trauma”, or speaking as if ordinary worries constitute anxiety disorders, can make people feel, and become, more fragile.
This year, Australia introduced legislation that gives employers a positive duty to protect workers from psychosocial hazards and risks. It’s right to recognise that psychosocial harm can be as damaging as physical harm, but it’s important to understand what psychological safety isn’t. As Harvard Business School professor Amy Edmondson stresses, it isn’t “feeling comfortable all the time”. It isn’t safety from discomfort; it’s safety to engage in conversations that might be uncomfortable. A manager should be able to gently raise an issue with an employee that might make them feel uncomfortable, without being accused of deliberately “violating” their safety – and vice versa. But I’ve heard managers and educators express concern that even well-intentioned, carefully delivered feedback could be perceived as an attack.
We can be uncomfortable and still be safe. If we lose this distinction – if managers in workplaces and teachers in schools, parents in the home and politicians in parliament, feel obliged to keep everyone comfortable all the time – we’ll end up in dangerous territory. We’ll be less able to express our views, less able to hear each other out, less able to learn from each other.
Honest, measured criticism plays an important role in society. We need to value it, even (perhaps especially) if it’s hard to hear. We also need to recognise that avoiding exposure to any and all discomfort will only heighten the sensation, and to consider the merit, case by case, of facing it.
So, what might this look like?
C.S. Lewis said that if you look for truth, “you may find comfort in the end”, but if you look for comfort, “you will not get either comfort or truth”. He wasn’t talking about performance reviews or reports or assignments, but the principle is still relevant.
If I seek the truth about my performance at work, I might find ways to improve it, gaining competence and confidence. A by-product will be a broader comfort zone. If I only want to hear praise, I can look for people who will only give it or find ways to only get it. I can avoid trying if I think I’ll fail, or I can start cheating to ensure success. In the short-term I might feel better, but over time, I’ll be progressively worse off.
We need more expansive conversations – more discomfort which, through exposure, will expand our comfort zones and increase our resilience. Silcox talks about how we need fewer “gatekeepers” – those who close doors and shut down important conversations – and more “locksmiths” – those who open them.
We can start by considering the way we speak, and think, and act. Are we using words that make others more cautious, risk-averse, fearful, fragile, than they need to be? What about when talking to ourselves? Are we so focussed on staying safe from risk that we forget how many risks are safe to take? With this in mind, we can resolve to sit with discomfort, to recognise the doors it can open that comfort can’t. We can welcome feedback, even ask for it; embrace challenges, even set them for ourselves; replace complete avoidance with strategic exposure. And if we’re in positions of authority, we can give those we oversee genuine permission to try and fail and try again.
It’s natural to avoid discomfort. But if we continually avoid what’s hard, it will feel harder still. If we want to talk about what matters and do it well; exercise moral courage; make a difference in the world – we’ll have to expand our comfort zones, not narrow them. We’ll have to take some risks. But do you know what else is risky? Sitting still.

BY Emma Wilkins
Emma Wilkins is a journalist and freelance writer with a particular interest in exploring meaning and value through the lenses of literature and life. You can find her at: https://emmahwilkins.com/
The ethics of pets: Ownership, abolitionism and the free-roaming model
Opinion + Analysis
Society + Culture
BY William Salkeld 16 DEC 2024
“If there are no dogs in heaven, I want to go where they go”, said the American performer, Will Rogers.
It is no secret that we are deeply connected to the animals we keep as pets. In Australia, most pet owners would like to think they treat their animals well. 52% of Australians consider their pets to be family members, and in 2021 Australians spent $30.7 billion on food, healthcare, and enrichment for their cats and dogs.
Studying animal ethics, I used to believe that pet ownership was a good thing. Following the arguments of moral philosopher Christine Korsgaard and many others, I saw it as a mutually beneficial situation. We provide our animals with the necessary conditions for a good life, such as food, shelter, healthcare, love and attention. In return, they provide us with companionship and countless other life-affirming benefits.
As long as we can provide our pets with the kind of life that is good for them, then having pets is a good thing, I reasoned. After all, how could I deny that my family’s Golden Retriever, Ruby, lived a charmed life full of play, food, and affection? As I would soon find out, the ethics of pet-keeping is far more complicated than ensuring our furry friends are ‘well taken care of’.
My view about pets was turned upside down after listening to a podcast with legal scholar and animal ethicist Gary Francione. Francione is an abolitionist, meaning that he believes that we should abolish all use and ownership of nonhuman animals, including pets. To understand this view, it is first important to understand the history of domesticated animals and pets.
Domesticated animals are those animals like dogs, cats, or sheep who have adapted or ‘tamed’ because of or by humans. While the jury is still out on how wolves initially interacted with humans to become dogs, humans have subsequently spent millennia forcibly breeding dogs to make them less aggressive, ‘cuter’, or specialised in some activity like cattle herding. Or, in the case of my family’s Golden Retriever, Ruby, it seems she has been bred to chew up sofas and love carrots (don’t ask).
As Francione puts it, we have domesticated dogs and cats into a state of perpetual childhood where they are dependent on us for basic care. Yet, unlike children, cats and dogs don’t grow up and become independent; they remain in a state of dependency for their entire lives.
The animals we keep as pets are a subset of domesticated animals. The concept of pet ownership in Western countries is surprisingly recent, beginning in the Victorian era. Our entitlement to owning pets has not existed throughout history, meaning we should be critical of current ethical attitudes towards pets.
For abolitionists like Francione, the problem of pet ownership is twofold. First, the state of dependency is problematic because we have bred animals to become dependent on us. Imagine if someone raised their child to be completely dependent on them and to never reach true adulthood; we would likely call it emotional abuse. Yet in the case of pets, it is considered ‘proper training’.
Second is the problem of ownership and possession. Animals (including pets) are treated as property in the law. It’s even in the name: ‘pet ownership’. Most of us know that animals are sentient, and perhaps even that they are individuals with rights. However, if this was truly the case, then it would be nonsensical that an animal could be ‘owned’ as property in the same way it is morally outrageous for a human to be owned.
For these two reasons, abolitionists like Francione think that if we could wave a magic wand, we should bring an end to the existence of all domesticated animals, like cats and dogs, who depend on us. While we should adopt and care for the animals that are already alive, we should also aim to abolish the institution of owning pets.
The abolitionist position is confronting and uncomfortable. I love my dog Ruby, and I was initially defensive against Francione's argument. Has my family really been doing something wrong by pouring hours of our lives and thousands of our (my parents') dollars into caring for a dog whom we treat as a member of the family? Even if we have not personally been doing anything wrong, the world without cats and dogs that Francione envisages seems like a slightly less happy world.
I was struggling with this dilemma when my partner and I met a cat at a guesthouse in Malaysia's Cameron Highlands, curled up in a box in the corner of the reception. When we asked the guesthouse owners what his name was, they said that he didn't have a name yet because he had only arrived three weeks earlier.
He, the cat, had picked them. In return for his company, the guesthouse owners gave him as much kibble as he would like and took him to the vet. On the same trip, I also learned that in Türkiye, many people have a ‘tab’ at the vet they use for vagabond cats who choose to live with them. This framework of domesticated animals without ‘owners’ is what I call the ‘free roaming’ model, an idea tentatively welcomed by animal studies scholar Claudia Hirtenfelder that, as Eva Meijer argues, acknowledges the agency of nonhuman animals.
Perhaps a more familiar example of a free-roaming animal is the life of the eponymous dog from the beloved Australian film, Red Dog. In the film, Red Dog is no one’s pet. Rather, he is cared for by the employees of a mine in Western Australia’s Pilbara and eventually decides to live with an American man named John Grant. Red Dog has agency, and [spoiler alert] uses that agency to travel across Australia in search of John after his premature death.
The key difference between free-roaming domesticated animals and pets is that these animals are not owned. Yes, many free-roaming cats and dogs may still depend on some human care, but they have a choice in how and from whom they receive that care.
The free-roaming model is not perfect, however. According to the Humane Society, 25% of outdoor (commonly referred to as 'feral') kittens do not survive past six months. This is not the case for kittens bred in captivity. And while some cats are lucky, like the one we met in Malaysia, many endure lives of illness and starvation. And, as Korsgaard points out, domesticated cats are a predatory threat to native wildlife, which is why, for example, they are not allowed outside without a leash in the Australian Capital Territory.
For the free-roaming model of animal companionship to work, we would have to make our urban spaces more hospitable to other animals. There would have to be a significant attitude shift in our tolerance of animals in public spaces where we usually don’t see them.
Until that day, perhaps the best thing we can do is adopt pets from shelters like the RSPCA, rather than breeders who perpetuate the institution of animal dependency and pet ownership. However, the free-roaming model gives hope for a post-pet world. As Korsgaard puts it:
“There is something about the naked, unfiltered joy that animals take in little things—a food treat, an uninhibited romp, a patch of sunlight, a belly rub from a friendly human—that reawakens our sense of the all-important thing that we share with them: the sheer joy and terror of conscious existence.”
This article won the 2024 Young Writers’ Competition 18-29 category.