
Major players in the video gaming industry face fines approaching $50 million if they fail to comply with new directives aimed at protecting children from grooming and radicalisation, a significant move that underscores the pressing need to address these serious online threats.
Australia’s eSafety Commissioner has issued a formal notice to popular gaming platforms including Roblox, Fortnite, Minecraft, and Steam. These companies are now required to demonstrate their strategies for identifying, preventing, and responding to severe online harms.
The necessity of these measures arises from growing worries that such platforms are being exploited by sexual predators to initiate contact with children or by extremists seeking to disseminate violent propaganda and cultivate radical ideologies among young users.
Lauded as a “global precedent,” the initiative has drawn support from experts including Australian Catholic University professor Niusha Shafiabady, who described it as a crucial step towards safeguarding younger generations.
“In the scale that a game like Roblox has, controlling everything is impossible … but any ways to mitigate these risks that exist is better than nothing,” she told AAP on Wednesday.
Roblox and Fortnite are among the most popular games for younger children, but both have been embroiled in various controversies.
Neo-Nazi, anti-Semitic and violent content has been found on Fortnite, including a map based on a concentration camp where 100,000 people were killed during World War II.
Terrorist attacks and mass shootings have reportedly been recreated on Roblox.
University of Sydney researcher Milica Stilinovic described the commission’s attempts as “essentially playing whack-a-mole” because the internet is fluid, but said compelling platforms to be forthcoming about the user experience was needed.
“Seeking transparency from these particular platforms is crucial because they’re not coming to the table in terms of how the plumbing works on the back end,” Dr Stilinovic said.
The video game platforms face fines of up to $825,000 per day should they fail to comply with the commissioner’s notice.
“Gaming platforms are amongst the online spaces most heavily used by Australian children, functioning not only as places to play, but also as places to socialise and communicate,” eSafety Commissioner Julie Inman Grant said.
“Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalisation and other off-platform harms.”
About nine in 10 Australian children between the ages of eight and 17 have played games online.
Online services are required to implement processes to protect Australians from illegal and restricted material, including measures to address risks of grooming.
Roblox has pledged to make private by default those accounts belonging to children under 16, and will introduce tools to prevent adults from contacting them without parental consent.
Fortnite developer Epic Games uses chat filters to remove hate speech and has implemented systems to automatically report potentially harmful chat interactions with those under 18.
Players under 16 are not allowed to use text or voice chat until a parent consents.