TikTok, the popular short-form video sharing app, is facing multiple lawsuits alleging that the company knowingly designed features that are harmful to its young users. Internal documents and communications exposed in a lawsuit filed by the state of Kentucky reveal that TikTok was aware of the negative impact its platform had on young users, despite publicly touting tools to limit kids’ time on the site. The lawsuit, filed alongside complaints from attorneys general in a dozen states and the District of Columbia, also sheds light on how TikTok tracked the time young users spent on the platform and used that information to drive user engagement.

The complaint also alleges that TikTok has prioritized “beautiful people” on its platform and used content moderation metrics that are “largely misleading.” TikTok’s algorithm was reportedly changed to show fewer “not attractive subjects” in the main feed, potentially perpetuating a narrow beauty norm among its users. The lawsuit also criticizes TikTok’s content moderation practices, noting significant “leakage” rates of content that violates community guidelines but is not removed. This includes content that normalizes pedophilia and glorifies minor sexual assault, with the company accused of misleading the public about its moderation efforts.

TikTok introduced time management tools, such as a 60-minute daily screen time limit for minors, billed as a way to help teens manage their time on the platform. The lawsuit argues, however, that these features served public relations more than they served effective time management. TikTok reportedly measured the success of the time limit feature by public trust and media coverage rather than by any reduction in teen screen time. One internal experiment found that the time-limit prompts reduced teens’ time on the app by only about a minute and a half, leading to allegations that the feature’s ineffectiveness was intentional.

The lawsuit also exposes internal discussions within TikTok about high levels of compulsive usage on the platform, including reports of teens watching content excessively because the algorithm is “really good.” An internal company report flagged this compulsive use as widespread, raising concerns about its interference with users’ daily activities such as sleep, eating, and social interaction. Despite safety features like default screen-time limits and family pairing, TikTok’s alleged prioritization of engagement metrics over user well-being has come under scrutiny.

Beyond the state complaints, TikTok also faces litigation from the Department of Justice. Together, these legal battles highlight the challenges social media platforms face in balancing user engagement with responsible content moderation. The outcome of the lawsuits could have significant implications for TikTok’s future and for how the company addresses the concerns raised about its impact on young users.
