
That’s not me: How deepfakes threaten our autonomy
Opinion + Analysis · Science + Technology · Society + Culture
BY Daniel Finlay 19 AUG 2025
In early 2025, 60 female students from a Melbourne high school had fake, sexually explicit images of themselves shared around their school and community.
Less than a year prior, a teenage boy from another Melbourne high school created and spread fake nude photos of 50 female students and was let off with only a warning.
These fake photos are also known as deepfakes: AI-generated or AI-altered photos, videos or audio that fabricate someone’s likeness. As the technology becomes more accessible and more convincing, its harmful uses multiply: pornography made without consent, financial loss through identity fraud, and damage to a political campaign or even to democracy itself through manipulation.
While these are significant harms, they also already exist without the aid of deepfakes. Deepfakes add something specific to the mix, something that isn’t necessarily being accounted for both in the reaction to and prevention of harm. This technology threatens our sense of autonomy and identity on a scale that’s difficult to match.
An existential threat
Autonomy is our ability to think and act authentically and in our best interests. Imagine a girl growing up with friends and family. As she gets older, she starts to wonder if she’s attracted to women as well as men, but she’s grown up in a very conservative family and around generally conservative people who aren’t approving of same-sex relations. The opinions of her family and friends have always surrounded her, so she’s developed conflicting beliefs and feelings, and her social environment is one where it’s difficult to find anyone to talk to about that conflict.
Many would say that in this situation, the girl’s autonomy is severely diminished because of her upbringing and social environment. She may have the freedom of choice, but her psychology has been shaped by so many external forces that it’s difficult to say she has a comprehensive ability to self-govern in a way that looks after her self-interests.
Deepfakes have the capacity to threaten our autonomy in a more direct way. They can discredit our own perceptions and experiences, making us question our memory and reality. If you’re confronted with a very convincing video of yourself doing something, it can be pretty hard to convince people it never happened – videos are often seen as undeniable evidence. And more frighteningly, it might be hard to convince yourself; maybe you just forgot…
Deepfakes make us fear losing control of who we are, how we’re perceived, what we’re understood to have said, done or felt.
Like a dog seeing itself in the mirror, we are not psychologically equipped to deal with them.
This is especially true when the deepfakes are pornographic, as is the case for the vast majority of deepfakes posted to the internet. Victims of these types of deepfakes are almost exclusively women and many have commented on the depth of the wrongness that’s felt when they’re confronted with these scenes:
“You feel so violated…I was sexually assaulted as a child, and it was the same feeling. Like, where you feel guilty, you feel dirty, you feel like, ‘what just happened?’ And it’s bizarre that it makes that resurface. I genuinely didn’t realise it would.”
Think of the way it feels to be misunderstood, to have your words or actions be completely misinterpreted, maybe having the exact opposite effect you intended. Now multiply that feeling by the possibility that the words and actions were never even your own, and yet are being comprehended as yours by everyone else. That is the helplessness that comes with losing our autonomy.
The courage to change the narrative
Legislation is often seen as the goal for major social issues, a goal that some relationships and sex education experts see as a major problem. The law is a slow beast. It was only in 2024 that the first ban on non-consensual visual deepfakes was enacted, and only in 2025 that this ban was extended to cover the creation, sharing or threatened sharing of any sexually explicit deepfake material.
Advocates like Grace Tame have argued that outlawing the sharing of deepfake pornography isn’t enough: we need to outlaw the tools that create it. But these legal battles are complicated and slow. We need parallel education-focused campaigns to support the legal components.
One of the major underlying problems is a lack of respectful relationships and consent education. Almost 1 in 10 young people don’t think that deepfakes are harmful, because they aren’t real and don’t cause physical harm. Perspective-taking skills are sorely needed. The ability to empathise, to fully put yourself in someone else’s shoes and make decisions based on respect for their autonomy, is the only thing that can stamp out the prevalence of disrespect and abuse.
On an individual level, making a change means speaking with our friends and family, people we trust or who trust us, about the negative effects of this technology to prevent misuse. That doesn’t mean a lecture, it means being genuinely curious about how the people you know use AI. And it means talking about why things are wrong.
We desperately need a culture, education and community change that puts empathy first. We need a social order that doesn’t just encourage but demands perspective taking, to undergird the slow reform of law. It can’t just be left to advocates to fight against the tide of unregulated technological abuse – we should all find the moral courage to play our role in shifting the dial.


BY Daniel Finlay
Daniel is a philosopher, writer and editor. He works at The Ethics Centre as Youth Engagement Coordinator, supporting and developing the futures of young Australians through exposure to ethics.