A SOCIAL media lawyer has slammed TikTok for allegedly creating a deadly algorithm that addicts children with extreme content.
Matthew Bergman represents the parents of children and teens who died by suicide after being encouraged by sick posts suggested to them by online platforms.
China-owned TikTok took the US by storm just a few years after its 2017 launch, becoming one of the most popular apps in the country.
It’s become a social media staple for its bite-sized content and endless scrolling capabilities that turn hours into a mere moment.
One of the most ingenious features behind the app is the For You page, which uses artificial intelligence to suggest videos based on a user’s likes and interactions.
However, according to some studies, the content suggested by the algorithm can get dark within 30 minutes of making an account.
“When you are on social media, you are not a customer. You are the product,” Bergman exclusively told The U.S. Sun.
Bergman explained that free-to-use social media companies get their money from ad revenue, which is primarily gained from user engagement.
Companies like Meta and ByteDance, the owner of TikTok, make more money the more people use their products.
Naturally, this means that bosses need to prioritize getting their audiences to not only log onto their platform but stay for prolonged periods of time.
“They’re not looking at showing you what you want to see. They want to show you what you can’t look away from,” Bergman said.
A report from the Center for Countering Digital Hate found that accounts made by 13-year-olds, the minimum age to use the app, were suggested posts promoting self-harm and disordered eating just 30 minutes after creation.
Bergman fears that teen girls interested in exercise could innocently log into TikTok only to be left with irreparable mental damage.
What starts as recipes and training routines could turn into a “rabbit hole” that promotes “emaciated bodies, guidance on how to live on 500 calories a day, and how to hide your condition,” Bergman claimed.
“That’s just one example of how the design of these platforms, gearing toward maximizing engagement, inexorably leads kids down rabbit holes of affliction,” he said.
Bergman represents the parents of children and teens who died by suicide with encouragement from content circulating on social media sites.
He said he’s currently working on one case that involved a 16-year-old boy who went to TikTok looking for advice on how to mend a broken heart after he experienced his first breakup.
But once the algorithm caught wind of his questions, the boy was allegedly sent material that told him to “shoot himself in the head” to numb the pain, Bergman claims.
It didn’t take long for the boy to heed the sick advice.
“It is absolutely inconceivable that these platforms are allowed to continue in the way that they are,” Bergman said.
Bergman is fighting for the algorithms to be turned off so that users can only find the content they actively look for.
“Show kids what they want to see, not what the platforms determine what is going to maintain their engagement,” he said.
He also suggested bolstering age and identity verification to determine whether children are old enough to use the app or view the content they’re watching.
This could both stop children from viewing videos that aren’t age-appropriate, and prevent sexual predators from posing as children.
Since its inception, TikTok has faced impassioned criticism from US lawmakers who fear it could be a vessel for a national security breach.
In January 2024, Montana tried to officially ban TikTok from all personal devices in a radical move that was recently blocked by a federal court.
And many universities in the country have used servers to block the app out of privacy concerns.
Meta may not be safe either, as more than three dozen states came together to file a bombshell lawsuit against the company for allegedly using features that hook children on Facebook and Instagram.
But ByteDance has stood its ground throughout the critiques and studies, maintaining it wants the best for its consumers.
In a response to the Center for Countering Digital Hate report, Mahsau Cullinane, a spokeswoman for TikTok, said: “This activity and resulting experience does not reflect genuine behavior or viewing experiences of real people,” per The New York Times.
“We regularly consult with health experts, remove violations of our policies and provide access to supportive resources for anyone in need.”
But Bergman rejects the company’s reasoning, and won’t stop fighting against what he calls “deadly” social media.
“I don’t think the companies’ behavior is going to change until they are held economically accountable for the carnage these platforms are inflicting on kids,” he said.
“Currently, the costs of these dangerous social media products are being borne by everyone but the companies.
“They’ve been borne by parents who have to bury their children instead of having their children bury them.
“Right now, they have absolutely no legal accountability for the foreseeable consequences that their products are having on [children’s] mental and physical health.
“Our goal through the litigation is to make them legally accountable and provide the economic incentive structure they need to change their outrageous corporate misconduct.”
The U.S. Sun has approached TikTok for comment.