Face value: the deepfake dilemma

New legislation has been enacted in NSW cracking down on sexually explicit deepfake content, sanctioning both its creation and distribution.  
AI as visualised in Google’s DeepMind project (image: Wes Cockx for Google DeepMind via Unsplash)

This comes after the eSafety Commissioner reported that AI-generated abuse has steadily increased over the past 18 months.

A deepfake is a video that has been digitally altered using deep-learning AI tools to create a highly realistic depiction of a person saying or doing something they never did.

First seen as far back as 2017, when Reddit user u/deepfakes coined the term, the technology has since evolved to the point where deepfakes can be generated in a matter of seconds.

Only a small amount of audio or visual data is needed, and as the technology becomes more accessible, the risk of abusive use grows.

Despite its evolving nature, Full Stop Australia CEO Karen Bevan says there is nothing new about how deepfake technology is being used: it continues to be "weaponised" against women.

“The attempt to intimidate, [to] control, to make people feel shamed and stigmatised, [to] cause people distress,” says Ms Bevan.   

The eSafety Commissioner reported last year that 99 per cent of pornographic deepfakes online are of women and girls. 

What is being done to help?  

Ms Bevan describes eSafety's work on image-based abuse as "world-leading", with its explicit image takedown and reporting service having expanded to cover deepfake material.

"We're very lucky that we have those services within Australia," says Dr Giselle Woodley.

Dr Woodley is a sexologist and lecturer/research fellow within ECU’s School of Arts and Humanities.  

She is concerned that some of the groups most commonly targeted by image-based abuse aren't made aware of the resources available.

Dr Giselle Woodley says schools need to better address sexual abuse (image: Edith Cowan University)

Dr Woodley believes that while changing legislation is a first step, better education for young people is critical in combating this form of abuse.

“They’re saying that … a lot of what they learn about sex and relationships, … they’d learn through social media.” 

Reflecting on the upcoming social media ban, Dr Woodley says it may pose more problems than solutions regarding information around sex and relationships.  

“If we’re … going to prevent them from having access … to further information and support, we’d bloody well better be supporting them in other ways.”  

As well as making changes to the curriculum, Ms Bevan says more conversations about sexual violence need to be had as a community.  

She believes that talking about how AI is being misused online will allow those who have been affected to speak out and seek help.  

The future 

With a note of irony, Dr Woodley suggests a hypothetical silver lining to the increase in deepfake content. 

"[Youth focus groups say] the leaking of nudes is so commonplace that … essentially the teens are saying 'big whoop'.

"I would imagine now when a nude gets released, they say, 'it's just a deepfake'."

As the technology becomes more widespread and cases continue to rise, it remains a very real threat, Ms Bevan explains.

“People are often left feeling deeply traumatised, … violated and also frightened of what might happen next. 

“We know that the scale-up of harm is real, and it is critical that our legal systems attempt to grapple with that.” 
