Meta Faces Lawsuit Over Child Safety: What Internal Documents Revealed

New Mexico sued Meta, alleging the company knew its platforms enabled child exploitation but did nothing to stop it. Released internal documents show Meta employees flagged serious safety risks involving minors.

Martin Holloway · Published 7d ago · 5 min read · Based on 6 sources

New Mexico sued Meta Platforms on December 5, 2023, claiming the company deliberately built Facebook, Instagram, and WhatsApp in ways that put children at risk. The lawsuit also alleges Meta misled users about how safe these platforms actually are.

This marks the first time a state has brought a major lawsuit focused specifically on how a social media platform harms children. It followed an undercover investigation by New Mexico's attorney general into how Meta handles content moderation and designs its features.

What the Internal Documents Showed

The most important evidence came when court papers were released on January 17, 2024. They contained private communications from Meta employees and leaders from 2020 and 2021 showing the company knew about serious safety problems involving children.

In these documents, Meta employees discussed how adults could easily contact children through the platform's messaging systems. One feature, "People You May Know" — which suggests connections based on data the platform has gathered about you — was flagged internally as a way strangers could reach out to minors.

The documents also show that Meta knew sexual content involving minors was being shared on Instagram. In one case from 2020, an Apple executive's 12-year-old child was approached inappropriately on the platform. Internal emails described the problem in blunt terms, noting it was the kind of issue that could anger Apple enough to remove Meta from the App Store.

This reveals that Meta understood both how serious the problem was and what it might cost the company if the problem went unaddressed.

How Algorithms Drive Engagement

New Mexico's case argues that Meta designed its recommendation systems specifically to keep young users on the platform longer and more often. Recommendation systems are the algorithms that decide what content you see and who the platform suggests you connect with.

The lawsuit says Meta continued using these engagement-focused systems even after internal teams documented that they could lead to exploitation of minors. In other words, the company may have knowingly chosen to prioritize keeping users engaged over keeping them safe.

Think of it this way: imagine a store that knows a particular layout brings in more shoppers but also makes it easier for someone with bad intentions to target children in that store. This lawsuit is arguing Meta did the digital equivalent of that.

Why This Lawsuit Matters Differently

This is not the first enforcement action against Meta. In October 2023, attorneys general from 33 states filed a separate lawsuit alleging Meta designed Instagram and Facebook to be addictive to children. However, New Mexico's case focuses specifically on sexual exploitation rather than broader concerns about addiction and mental health.

New Mexico took a strategic approach by focusing on consumer protection laws — essentially arguing that Meta deceived consumers and parents about safety — rather than on the more legally complicated question of whether social media platforms should be held liable for all harmful content. This approach may be easier to win in court.

The broader regulatory pressure on Meta started in November 2021, when New Mexico joined other states investigating how Instagram affects young users. Earlier, in May 2021, a bipartisan group of 44 attorneys general had urged Meta to scrap plans for an Instagram app designed specifically for children under 13, citing exploitation and developmental risks.

The Bigger Picture

I have covered platform regulation battles since the 1990s, and we have seen this pattern before: companies promise to police themselves, then internal documents emerge showing they knew about problems all along, and finally governments coordinate enforcement actions. The Meta case follows this familiar path, but the focus on child safety gives it particular urgency and weight.

What happens next will likely shape how other tech companies design their platforms. If New Mexico wins, other states may file similar lawsuits. Courts may require tech companies to change how their algorithms work, particularly features that affect children. The process of discovering internal documents — which is already happening in this case — could become standard practice in state investigations of social media companies.

The case represents a test of whether individual states can do what Congress and federal agencies have largely failed to do: effectively regulate how social media platforms are built and how they treat young users.

What This Means for Platform Design

One real challenge for Meta and other social media companies is that manually reviewing every connection and piece of content across billions of users is not feasible. However, the documents in this lawsuit suggest Meta had the ability to spot and fix specific dangerous patterns — such as how the "People You May Know" feature connected minors to strangers — but chose not to prioritize that work.

The case raises hard questions about what tech companies owe users, especially vulnerable ones. It also highlights a central tension: business models built on maximizing how much time people spend on a platform can work against that platform's ability to keep users safe.

Whether states can use consumer protection law to force changes in how platforms are designed remains to be seen. But this lawsuit signals that the era of largely unregulated social media may be coming to an end.