
The use of AI to create child abuse material is surging, prompting authorities to warn parents about the technology's dangers.

The Australian Centre to Counter Child Exploitation, led by the Australian Federal Police, has witnessed an increase in illegal AI-generated material in the past year.

Part of that increase has come from students creating images such as deepfakes, for reasons including harassing or embarrassing classmates.

The AFP has warned of a rise in AI-generated child abuse material. (Getty Images/iStockphoto)

Last year, one Australian man was jailed for possession of AI-generated child abuse material, and another was jailed for using AI to produce child abuse images.

AFP Commander Helen Schneider urged parents to have open and non-judgemental conversations with their children, and said many young people might not be aware that using AI to create material featuring their classmates could be criminal.

“Children and young people are curious by nature, however, anything that depicts the abuse of someone under the age of 18 – whether that’s videos, images, drawings or stories – is child abuse material, irrespective of whether it is ‘real’ or not,” Schneider said.

“The AFP encourages all parents and guardians to have open and honest conversations with their child on this topic, particularly as AI technology continues to become increasingly accessible and integrated into platforms and products.”

Some of the child abuse material is being created by young people, sometimes to target classmates and peers. (Getty Images/iStockphoto)

She said an AFP-led education program, ThinkUKnow, offered free resources in this area for parents and carers.

Research conducted by the ACCCE in 2020 revealed only about half of parents talked to their children about online safety.

“These conversations can include how they interact with technology, what to do if they are exposed to child abuse material, bolstering privacy settings on online accounts, and declining unknown friend or follower requests,” Schneider said.

In the 2023-24 financial year, ThinkUKnow delivered 2218 presentations about online child sexual exploitation to 202,905 students across Australia.

The program, run by the AFP, state and territory police and industry volunteers, also delivered 317 presentations to more than 21,500 parents, carers and teachers during the same period.

People seeking support, resources or ways to report child abuse material should visit the ACCCE website.
Support is available from the National Sexual Assault, Domestic and Family Violence Counselling Service at 1800RESPECT (1800 737 732).