Meta Platforms has pressed ahead with plans to implement default end-to-end encryption across its messaging services, despite internal warnings that the change could endanger child safety. Newly disclosed court filings reveal that senior officials expressed grave concerns about the implications for detecting and reporting child exploitation. The documents emerged from a lawsuit filed by New Mexico Attorney General Raúl Torrez.
Internal documents, made public on March 15, 2024, include emails and messages highlighting the apprehensions of Meta’s safety and policy teams ahead of the encryption announcement. In a 2019 chat, as Mark Zuckerberg prepared to publicize the initiative, senior executive Monika Bickert remarked, “We are about to do a bad thing as a company. This is so irresponsible.”
The lawsuit alleges that Meta’s platforms have inadvertently given online predators unfettered access to minors, facilitating connections that can lead to serious crimes, including human trafficking. The case, which went to trial in March 2024, is the first such suit against the tech giant to reach a jury as the company faces mounting legal scrutiny worldwide over the safety of young users.
Concerns about the encryption plan have been echoed by various governmental and legal entities. A coalition of over 40 state attorneys general in the United States is pursuing lawsuits against Meta, asserting that its products negatively impact youth mental health. In addition, several school districts have launched legal actions, and Zuckerberg has recently testified in a separate case concerning a teenager allegedly harmed by the company’s offerings.
The New Mexico lawsuit specifically accuses Meta of misrepresenting the safety risks associated with its encryption plans, which were first announced in 2019 and later extended to include direct messages on Instagram. End-to-end encryption allows messages to be transmitted in a format that only the recipient’s device can decode. While this feature is prevalent in many messaging platforms, such as iMessage and WhatsApp, child safety advocates, including the National Center for Missing and Exploited Children, warn that it can exacerbate risks within large social networks where children are vulnerable to being targeted by strangers.
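The core idea behind end-to-end encryption, that decryption keys exist only on the endpoint devices and never on the intermediary’s servers, can be sketched with a textbook Diffie-Hellman key exchange. This is a minimal toy illustration of the principle with deliberately small demo parameters, not Meta’s actual implementation:

```python
# Illustrative toy only: a textbook Diffie-Hellman key exchange, the kind of
# primitive that end-to-end encrypted messaging builds on. Real deployments
# use vetted protocols and far larger parameters.
import secrets

# Public parameters, visible to everyone, including any relay server.
p = 4294967291  # small demo prime; real systems use 2048+ bit groups
g = 5           # generator

# Each endpoint generates a private key that never leaves its device.
alice_private = secrets.randbelow(p - 2) + 1
bob_private = secrets.randbelow(p - 2) + 1

# Only these public values travel over the network.
alice_public = pow(g, alice_private, p)
bob_public = pow(g, bob_private, p)

# Each side combines its own private key with the other's public value;
# both arrive at the same shared secret, which the server never holds.
alice_shared = pow(bob_public, alice_private, p)
bob_shared = pow(alice_public, bob_private, p)

assert alice_shared == bob_shared
```

Because the shared secret is derived locally on each device, a server that merely relays the public values cannot recover it, which is why content transmitted this way cannot be scanned in transit.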
Internal communications from Meta illustrate that some safety officials shared similar concerns. Bickert criticized the company for making “gross misstatements of our ability to conduct safety operations” while promoting the encryption initiative. She said that with end-to-end encryption, “there is no way to find the terror attack planning or child exploitation” and refer such cases to law enforcement.
A briefing document from February 2019 indicated that if Messenger had been encrypted, Meta’s reports to the National Center for Missing and Exploited Children regarding child nudity and sexual exploitation imagery could have plummeted from 18.4 million to 6.4 million, a reduction of approximately 65%. Another update warned that the company would have been “unable to provide data proactively to law enforcement in 600 child exploitation cases, 1,454 sextortion cases, 152 terrorist cases, and 9 threatened school shootings.”
Antigone Davis, another Meta safety official, expressed in a 2019 email that the company’s social networks could facilitate the grooming of children before they are exploited through private messaging channels. She highlighted the risks associated with Messenger, noting, “FB allows pedophiles to find each other and kids via social graph with easy transition to Messenger.” Davis compared these risks to WhatsApp, stating, “WA does not make it easy to make social connections,” implying that encrypting Messenger would significantly worsen the situation.
In response to the concerns raised, Meta spokesperson Andy Stone emphasized that the issues identified by Bickert and Davis prompted the development of additional safety features before the encryption rollout in 2023. “The concerns raised in 2019 represent the very reason we developed a range of new safety features to help detect and prevent abuse, all designed to work in encrypted chats,” he stated.
Under the updated system, while messages are encrypted by default, users retain the ability to report concerning conversations to Meta. The company can then review these messages and refer cases to law enforcement if necessary. Furthermore, Meta has implemented protections aimed specifically at underage users, including measures to prevent adults from initiating contact with minors they do not know.
The ongoing revelations about Meta’s encryption plans underscore the tension between privacy features and child safety measures, as the company navigates increasing scrutiny from regulators and advocacy groups.
