“There is a place for AI and legitimate tracking technology in Australia, but there is no place for apps and technologies that are used solely to abuse, humiliate, and harm people, especially our children,” she said.
Independent MP and roundtable co-convener Kate Chaney said: "AI is changing child sexual abuse in Australia. It's making it easier to generate this material, but it can also be used to help law enforcement."
The government did not specify a timeframe for when the restrictions will take effect, but it has committed to working with industry to enforce them.
The rise of ‘nudify’ apps
“With just one photo, these apps can nudify the image with the power of AI in seconds. Alarmingly, we have seen these apps used to humiliate, bully, and sexually extort children in the school yard and beyond,” Grant said earlier this year.
“Not only does it normalise and desensitise this behaviour, but also makes it hard for law enforcement to identify actual victims; they’re spending a lot of time trying to distinguish between synthetic material and actual children.”
Working alongside tech platforms
Jennifer Duxbury, director of regulatory affairs, policy and research at DIGI, said: "We support ecosystem approaches to tackling harm, and look forward to working constructively with the government on the details of the proposal."

Meta, which owns Facebook and Instagram, said deepfake abuse material violates its rules and that it responds proactively to reports of harm.
The problem isn’t just local.