Why Don’t We Treat Non-Consensual Deepfake Porn Of Women As Rape?


With the rise of AI in our everyday lives (hello, ChatGPT), we’re facing all kinds of complicated legal, ethical and existential questions about what it means to exist alongside such advanced technology. But a topic I don’t see discussed enough is the growing use of AI to create non-consensual deepfake porn, and its impact on how we define sexual violence.

Before we go into this any further, let me tell you a story: On January 31, Twitch streamer Brandon Ewing, better known as Atrioc, issued a tearful apology to the streaming community after he was caught watching deepfake porn of his fellow female streamers while his wife was out of town.

He outed himself when he tabbed out of a game while streaming, accidentally giving viewers a glimpse of an open window featuring porn of multiple popular female streamers, made without their consent.

It took little time for Atrioc to hop on live and turn on the waterworks, feverishly defending his actions as an “embarrassing” one-off that only happened because he saw an ad, clicked on it and fell into a “rabbit hole”. Except, as other streamers pointed out and he later admitted, Atrioc paid to view the deepfake porn, which could only be accessed as part of a subscription.

QTCinderella, one of the streamers whose likeness was used on the website, posted an emotional stream not long after, in which she revealed people were spamming her with pornographic deepfakes of herself.

“I wanted to go live because this is what pain looks like,” she said.

“Fuck the constant exploitation and objectification of women, it’s exhausting. Fuck Atrioc for showing it to thousands of people. Fuck the people DMing me pictures of myself from that website. Fuck you all.

“If you are able to look at women who are not selling themselves or benefiting off of being seen sexually… you are the problem.”

Misogynistic Twitch users were quick to claim the women in question had “asked for it” by simply existing. Others said it was “just porn” and to “get over it”, since Atrioc hadn’t actually touched these women and had therefore not assaulted them.

Maya Higa, another of the women whose likeness was used in the deepfake porn Atrioc was caught watching, shared a harrowing statement in which she said she felt like she had been sexually assaulted.

“In 2018, I was inebriated at a party and I was used for a man’s sexual gratification without my consent,” she wrote.

“Today, I have been used by hundreds of men for sexual gratification without my consent. The world calls my 2018 experience rape. The world is debating over the validity of my experience today.”

Her pained words have opened up a conversation that I think is long overdue: should we treat deepfake porn made and distributed without its subjects’ consent as sexual assault?

“It’s absolutely sexual violence,” Dr Emma Jane, one of the world’s leading academic experts on digital misogyny and gender-based violence, told me over the phone.

She calls acts like creating non-consensual deepfake porn “technology-facilitated gender-based violence”, stressing that just because something happens “online”, doesn’t mean it’s “not real”.

“I don’t agree that there’s this stark dichotomy. The internet and our daily lives are completely enmeshed now,” she told PEDESTRIAN.TV.

“These types of technologies (VR and AI) are designed to be immersive and to really trick your brain and your body into thinking that you are physically engaged in what you are doing.

“Like, people who walk along a plank in VR report getting really sweaty and fearful even though they know the plank they are walking on is just virtual and not real.

“So I do worry about the capacity for this type of sexual violence in these kinds of emerging tech spaces. It’s really disappointing to see the same defences being wheeled out that there’s a distinction to be made between, you know, ‘real life’ and ‘online life’, or in this case, porn.”

Dr Jane told me about the first documented “cyber rape”, which was reported by a journalist in 1993 and happened on LambdaMOO, an online community from a time when the internet was still in its infancy.

“It was like a multiplayer online game with quite a notorious incident, and involved really only text. There was all this debate about whether virtual rape is, you know, real rape,” she recalled.

“Fast forward 30 years and we’ve got Horizon Worlds, Facebook’s Metaverse open for, you know, barely seconds before the first gang rape is reported. And exactly the same response: ‘it’s not real’, ‘they don’t have legs, how could it be rape?’, ‘you can’t tell what’s real and what’s not real’.

“And really, these are not that far from the kind of responses that, unfortunately, a lot of people still give women ‘IRL’ when they complain about sexual assault. You know, ‘it was just a slap on the bum’, ‘you were asking for it’. The downplaying of sexual violence doesn’t just happen in the internet domain.

“The fact that rapes are starting to be reported in the Metaverse and in virtual reality situations, I think they are really closing the gap even further between this idea of ‘online’ and ‘offline’.”

So if online-facilitated sexual violence can be just as traumatic for some women as offline sexual violence, do we need to update our current sexual assault laws to account for this?

While Dr Kath Albury, who is leading the Digital and data literacies for sexual health policy and practice research project at Swinburne University of Technology, agrees online sexual violence is deeply traumatic, she reckons a new sexual assault law is probably not the best way to handle it.

“I would kind of rewind a little bit,” she tells PEDESTRIAN.TV.

“There’s really good documentation on the ways that non-consensual image sharing, or even just knowing that a non-consensual image exists, is traumatising to victims.

“Demanding that it be legally framed as sexual assault I think opens up a whole lot of issues.

“One because, you know, we know really clearly that most victims of sexual assault are re-traumatised through the court process and have a really high burden of proof and a whole lot of other things and most perpetrators are never actually prosecuted. So saying we need another law is asking for a really broken system to do another broken thing.

“And it becomes a local law. It’s like, ‘oh, well, the host of the server that hosted the porn was actually in Germany, it wasn’t in Australia. So do we have jurisdiction over blah, blah.’ It becomes a lot of money being generated for lawyers and not a justice or survivor focused response.”

Instead, Dr Albury reckons the laws already emerging around image-based abuse will be more helpful in getting justice for victims of deepfake porn.

Image-based abuse is a breach of the Online Safety Act 2021. It’s also a criminal offence in Australia under Commonwealth, state and territory laws. The latter vary between jurisdictions but some can certainly be interpreted to include deepfakes.

“I would put this in a category that does exist and is pretty well documented, which is image-based abuse,” Dr Albury said.

“It’s an umbrella term that covers things like sexting shared without consent, images made with consent [which] are then shared without consent. It covers things like upskirt photos, even things like blackmail where you would say ‘if you break up with me, I’ll release the nudes that we’ve shared consensually during the relationship’, et cetera, et cetera.

“The utility of that term is that it steps away from the prior term revenge porn, which one, you know, was relating non-consensual activity to the consenting products of the sex industry basically. And it actually frames it distinctly as abuse rather than having the term revenge in there, which somehow makes it look like the victim did something to deserve it. I find that term much more useful.”

Dr Albury also noted we should have more critical conversations about the role of social media platforms in terms of regulating non-consensual sexually explicit content and protecting users from harm.

“So with Twitch, for example, I don’t know, do they have a code of conduct around non-consensual streaming? And the use of AI in generating content without the consent of the content’s subject? Can you stream about your criminal activity, even though you didn’t do the crime on Twitch? What are the expectations around conduct towards other members of the community?”

Twitch does in fact have community guidelines against non-consensual sexual activities, sexual exploitation and sexual harassment, which can result in indefinite suspension of a user’s account. Interestingly, it appears Atrioc was not suspended for his deepfake porn viewing, which you would think is covered by these guidelines.

PEDESTRIAN.TV has reached out to Twitch for comment. We’ve also asked whether, and how, the platform provided any support for the female streamers who were harassed and violated on it.

eSafety Commissioner Julie Inman Grant also told PEDESTRIAN.TV platforms have a responsibility to get deepfakes off their websites.

“Innovations to help identify, detect and confirm deepfakes are advancing and technology companies have a responsibility to incorporate these into their platforms and services. We call this #SafetybyDesign,” she said in a statement.

“Any Australian whose images or videos have been altered to appear intimate and are published online without consent can contact eSafety for help to have them removed.”

Recording, sharing, or threatening to share an intimate image without consent is a criminal offence across much of Australia. 

If you’d like to report image-based abuse to police or get help removing an intimate image from social media, reach out to the Australian eSafety Commissioner.

If you’d like to speak to someone about sexual violence, please call the 1800 Respect hotline on 1800 737 732 or chat online.

Under 25? You can reach Kids Helpline at 1800 55 1800 or chat online.
