Executives at Meta Platforms pressed ahead with plans to implement end-to-end encryption across the company's messaging services despite internal warnings about potential risks to child safety. Newly disclosed court documents indicate that senior safety and policy officials warned the encryption initiative would undermine the company's ability to detect cases of child exploitation and report them to authorities. The revelations come from a lawsuit filed by New Mexico Attorney General Raúl Torrez.
The internal communications, made public on March 15, 2024, reveal unease within the company as preparations for the encryption rollout began. In a chat from March 2019, Monika Bickert, head of global policy management, expressed profound concern, stating, "We are about to do a bad thing as a company. This is so irresponsible." Her remarks reflect a broader disquiet among executives over the implications of making encryption the default on Facebook's and Instagram's messaging services.
The lawsuit is notable as it marks the first case against Meta to reach a jury, focusing on allegations that the company enabled online predators to exploit underage users. In conjunction with the New Mexico case, a coalition of over 40 state attorneys general in the United States has initiated lawsuits asserting that Meta’s platforms harm youth mental health. Additionally, several school districts have filed legal actions, and Mark Zuckerberg has testified in connection with another case involving alleged harm to a teenager.
The New Mexico lawsuit specifically accuses Meta of misrepresenting the safety implications of its encryption plans, first announced in 2019. The proposed end-to-end encryption is designed to protect user privacy by ensuring that only the intended recipient can read messages. While this feature is standard in many messaging applications like WhatsApp and iMessage, child safety advocates, including the National Center for Missing and Exploited Children, have cautioned that such encryption could pose significant risks on platforms where children frequently interact with strangers.
Internal Meta communications highlighted similar concerns from within the company. Bickert criticized the encryption initiative, asserting that it would severely hinder the company’s ability to monitor and report potential threats. She remarked, “With end-to-end encryption, there is no way to find the terror attack planning or child exploitation,” emphasizing the difficulties in proactively referring cases to law enforcement.
A briefing document from February 2019 cited in the filings estimated a dramatic reduction in Meta's reports to the National Center for Missing and Exploited Children. It predicted that reports involving child nudity and sexual exploitation imagery would drop from 18.4 million to 6.4 million if Messenger were encrypted, a decrease of approximately 65%. A subsequent update warned that the company would be unable to proactively provide essential data to law enforcement in numerous child exploitation cases.
The documents also indicate that safety officials expressed concerns about children being groomed via Meta’s platforms before being exploited in private messaging channels. In a 2019 email, Antigone Davis, global head of safety, highlighted the risks, stating, “FB allows pedophiles to find each other and kids via social graph with easy transition to Messenger.” She contrasted these risks with those on WhatsApp, emphasizing that its lack of social networking features made it less prone to exploitation.
Responding to inquiries from Reuters, Meta spokesperson Andy Stone stated that the concerns raised by Bickert and Davis informed the development of additional safety features prior to the rollout of encrypted messaging on Facebook and Instagram in 2023. Stone explained, “The concerns raised in 2019 represent the very reason we developed a range of new safety features to help detect and prevent abuse, all designed to work in encrypted chats.”
Under the updated system, while messages are encrypted by default, users can still report troubling conversations to Meta. The company can then review the messages and refer cases to law enforcement when necessary. Meta has also implemented specific protections for underage users, including measures to prevent adults from initiating contact with minors they do not know.
As Meta navigates these legal challenges, its encryption strategy continues to raise questions about how to balance user privacy against the safety of vulnerable populations, particularly children. The ongoing scrutiny reflects growing concern among regulators and advocacy groups over the responsibilities social media companies bear in protecting their youngest users.
