Technology

Europe's Regulators Say Meta and TikTok Aren't Being Transparent Enough. Here's What That Means.

European regulators have found that Meta and TikTok violated transparency rules under a new set of laws called the Digital Services Act. The violations include failing to share data with researchers, running faulty systems for reporting illegal content, and offering a broken process for appealing content decisions.

By Martin Holloway
On October 24, 2025, European regulators said they found that Meta (which owns Facebook and Instagram) and TikTok have broken transparency rules. The accusation comes under something called the Digital Services Act, a new set of EU rules designed to control how large online platforms operate. This is the latest step in Europe's effort to hold big tech companies accountable.

The Commission's findings list four specific problems with Meta: it hasn't been transparent enough in its reporting, it hasn't given researchers proper access to data, its systems for reporting illegal content don't work well, and its process for appealing content removals is broken.

Why These Rules Matter

Europe launched formal investigations into Facebook and Instagram in April 2024. Officials wanted to understand how Meta handles political content, polices advertising, removes illegal posts, and lets researchers study how the platform works. A few weeks later, in May, they opened another investigation, this one focused on whether Meta is properly protecting children.

The transparency rules are designed to let people outside the platforms—like researchers and journalists—see how these companies actually work and what happens on their sites. Right now, according to regulators, Meta isn't letting researchers access enough information to study the platform's effects on society. That matters because researchers need data to see if there are real problems that regulators should know about.

The other problems regulators found are more straightforward: when users report illegal content or want to challenge a decision about something they posted, the systems for doing those things don't work well enough.

Protecting Kids: A Bigger Problem

The investigation also flagged a problem that affects parents. Facebook and Instagram say you have to be at least 13 years old to have an account. But regulators found that Meta doesn't do much to check users' ages or stop younger children from signing up. Once a child gets in, Meta also isn't checking to make sure they don't see adult content.

Snapchat is facing the same questions. Like Facebook, it says you need to be 13 to use the app, but regulators doubt the company is actually verifying that when people sign up.

Europe has been working on a tool to help with age verification, though details on how it will work are still unclear.

What Happens Next

Right now, these are preliminary findings. Meta and TikTok still have a chance to fix the problems before regulators make a final decision. But the investigation covers so many different issues that both companies likely have significant work ahead.

Europe's rules allow fines of up to 6% of a company's yearly revenue worldwide. That's a lot of money, and it gives these companies a strong reason to act quickly.

The broader picture here is that Europe is taking a hands-on approach to regulating the internet. These investigations aren't one-time checks—they appear to be the start of ongoing oversight. Other big platforms should probably expect similar scrutiny in the coming months.

Historically, large tech platforms have fought against new regulations at first, then adjusted their practices over time. We've seen this cycle before with earlier privacy rules in Europe. The same pattern may play out here, though the stakes are higher because the rules are more detailed and the penalties are more severe.