Authorities are investigating reports that explicit deepfake images featuring the faces of female high school students in Sydney have been circulated.
The New South Wales Police have verified that officers in Ryde, located north-west of the city’s central business district, are actively looking into these claims.
“Officers from the Ryde Police Area Command have initiated an investigation,” a police spokesperson stated.
“Investigations are ongoing, and no additional information is available at this moment.”
The Department of Education said it is working closely with police and will take action if any student is found to be behind the images.
“Deepfakes present significant new risks to the wellbeing and privacy of students,” a department spokesperson said.
“The school is working closely with police on this matter.
“If any student is found to have engaged in this behaviour, the school will be taking strong disciplinary action.”
Acting Education Minister Courtney Houssos said the reports were “deeply concerning”.
“These are deeply concerning reports, and police are investigating this matter as is appropriate. I expect those responsible to face the appropriate consequences,” she said.
“I understand the school has support in place for the students affected as well as support for the broader school community.
“As this is an active police investigation, it would be inappropriate to comment further.”
NSW tightens laws around deepfakes
Last month, NSW passed legislation to criminalise using artificial intelligence to create intimate images without consent.
The offence is now punishable by up to three years in jail.
Sharing such images carries the same maximum penalty, even if the person sharing them did not create them.
The amendments also cracked down on the creation, recording and sharing of sexually explicit audio that is either real or designed to sound like a real person.
Attorney-General Michael Daley said the laws would better protect people, particularly young women, from image-based abuse.
“This bill closes a gap in NSW legislation that leaves women vulnerable to AI-generated sexual exploitation,” he said at the time.
“We are ensuring that anyone who seeks to humiliate, intimidate or degrade someone using AI can be prosecuted.”
‘Current crisis’ for schools and students
Deepfakes are digitally altered photos, videos or voice recordings of someone that have been edited to falsely depict them.
Tools to create deepfakes can be exploited to create non-consensual and fake explicit images of a person, with women and girls the most likely targets.
The eSafety commissioner, Julie Inman Grant, in June said deepfakes are a “current crisis affecting school communities across Australia”.
“Students have found their image represented in fake nude photos or videos,” the commissioner wrote at the time.
“Others have received AI-generated explicit content of their peers. Entire school communities have been thrown into turmoil – with families, educators, and students unsure how to respond.”
Inman Grant added that deepfakes are “increasingly in use” among young people.
Readers seeking support can contact Lifeline on 13 11 14, Beyond Blue on 1300 22 4636, or Kids Helpline on 1800 55 1800.