A mother and her 14-year-old son from Oklahoma County have initiated legal action against the online gaming platform Roblox, accusing it of facilitating sexual exploitation.
The lawsuit portrays Roblox as a “hunting ground” for child predators, raising significant concerns about the safety of young users on the platform.
“It’s extremely dangerous and quite alarming. Parents aren’t receiving adequate warnings about the persistent and serious threats present on Roblox,” stated attorney Sara Beller.
The Oklahoma family is participating in a federal lawsuit against the well-known gaming site, citing “sextortion” as a critical issue.
Sara Beller, an attorney with Dolman Law Group, said the boy, who was 12 years old at the time of the alleged incidents, frequently used the app for both entertainment and social interaction, making him particularly vulnerable.
According to the complaint, the boy believed he was speaking with someone his age on Roblox’s “chat” feature.
The lawsuit details how the person's behavior escalated: sending graphic, sexually explicit messages and images, and eventually manipulating the 12-year-old into sending inappropriate images and videos of himself in return.
More than 800 families have already sued the platform over alleged “sextortion.”
Vaughn told Nexstar’s KFOR that the biggest danger with Roblox is the chat feature.
“That’s how other people get access to the child,” Vaughn said.
Roblox sent KFOR a statement about the incident:
“We are deeply troubled by any incident that endangers our users. While we cannot comment on claims raised in litigation, protecting children is a top priority, which is why our policies are purposely stricter than those found on many other platforms. We recently announced our plans to require facial age checks for all users accessing chat features, making us the first online gaming or communication platform to do so. This innovation enables age-based chat and limits communication between minors and adults. We also limit chat for younger users, don’t allow the sharing of external images, and have filters designed to block the sharing of personal information.
“We dedicate substantial resources—including advanced technology and 24/7 human moderation—to help detect and prevent inappropriate content and behavior, including attempts to direct users off-platform where safety standards and moderation may be less stringent than ours. We understand that no system is perfect, which is why we are constantly working to improve our safety tools and platform restrictions. We have launched 145 new safety initiatives this year alone and recognize this is an industry-wide issue requiring collaborative standards and solutions.”
The spokesperson said the company encourages anyone to use the “report abuse” feature to flag content or behavior that may violate the app’s community standards.
Vaughn wants to remind families and children that nothing uploaded to the internet is ever entirely safe, because you never know where it will end up.
“Even the most innocent picture can be altered with artificial intelligence to be an embarrassing picture, and that’s what you don’t want,” Vaughn said.
The attorney representing the family said their lives have been forever changed by the incident. A hearing is scheduled for December.
Last month, Roblox was issued a subpoena by Florida’s attorney general, requesting information about the company’s age-verification requirements, chat rooms, and marketing toward children.
The Associated Press contributed to this report.