Meta Faces Lawsuit Over Alleged Inaction on Sex Trafficking

Meta Platforms, Inc. is under scrutiny following unsealed court filings alleging that the company failed to promptly address accounts involved in sex trafficking. The allegations are part of a lawsuit filed in California on behalf of more than 1,800 plaintiffs, including school districts, children, parents, and state attorneys general. The lawsuit claims that social media companies, including Meta, pursued aggressive growth strategies while neglecting the adverse effects of their platforms on children's mental and physical well-being.

The lawsuit asserts that Meta, which owns Facebook, Instagram, WhatsApp, and Threads, maintained a policy that tolerated up to 16 violations related to prostitution and sexual solicitation, suspending an account only on the seventeenth. Former Instagram safety chief Vaishnavi Jayakumar testified that she was taken aback to learn of this "17-strike" policy, describing the threshold for suspension as "very, very high" compared to industry standards.

Allegations of Recklessness and Inadequate Response

The court documents indicate that Meta was aware of serious risks on its platforms, including millions of adults contacting minors and content that exacerbated mental health problems among teenagers. Despite identifying these dangers, the company reportedly removed only a small fraction of harmful content related to suicide, eating disorders, and child sexual abuse. The plaintiffs argue that this inaction demonstrates a disregard for the safety and well-being of young users.

In response to the allegations, Meta stated that it has since adopted a "one-strike" policy, under which accounts involved in human exploitation are removed immediately. The company emphasized that the earlier "17-strike" system has been replaced, in an effort to address the concerns raised by the lawsuit.

Increased Scrutiny and Legal Challenges

Meta is facing growing scrutiny in the United States. Earlier in 2025, reports emerged that the company's AI chatbots could engage minors in inappropriate conversations, prompting the introduction of new safeguards for teen accounts. Parents now have the option to block interactions between their children and the chatbots.

On a global scale, Meta faces expanding legal and regulatory challenges. In 2022, Russia designated the company an "extremist organization" over its refusal to remove prohibited content. The company is also the subject of multiple legal actions within the European Union, including a €797 million antitrust fine related to Facebook Marketplace. Separate cases involving copyright, data protection, and targeted advertising are ongoing in Spain, France, Germany, and Norway, among other countries.

As the lawsuit progresses, the implications for Meta could be significant, not only in terms of potential damages but also regarding the company’s policies and practices moving forward. The outcome may influence how social media platforms manage user safety and liability in an increasingly scrutinized digital landscape.