The European Union has found that Meta and TikTok have failed to adequately protect children on their platforms. According to the European Commission's preliminary findings, both companies violated the Digital Services Act (DSA). The report shows that it was difficult for users to report illegal content, such as child sexual abuse material. The platforms also allegedly made it difficult for researchers to access important data needed to verify the protection of minors.
The EU's Digital Services Act is intended to ensure that large online platforms take responsibility for the content on their services. It obliges companies like Meta and TikTok to combat harmful or illegal content and to create transparent structures for research and oversight. However, the EU's findings show that significant shortcomings exist in practice. According to the preliminary findings, children and young people continue to be exposed to risks, while important control mechanisms have been blocked or obstructed.
Violations of the Digital Services Act
According to the European Commission's preliminary findings, Meta and TikTok have violated key obligations under the Digital Services Act. Both companies are alleged to have made it difficult for researchers to access public data. This data is necessary to investigate whether children are exposed to illegal or harmful content on the platforms. The EU found that the procedures and tools Meta and TikTok provide for data access are cumbersome and unreliable. In many cases, researchers received only incomplete or inaccurate data, making independent investigations virtually impossible.
The Commission emphasizes that the DSA requires platforms to grant researchers appropriate access to verify user protection. By obstructing this, Meta and TikTok, according to the EU, violate the principles of transparency and accountability.
Inadequate reporting options and "dark patterns" at Meta
Another point of criticism concerns the handling of illegal content, particularly child sexual abuse material. According to the EU Commission, Meta has made it difficult for users to report such content on Facebook and Instagram. Both platforms apparently do not offer a user-friendly "notice and action" mechanism that makes it easier to report illegal content. Instead, the reporting processes are complicated and confusing.
The Commission accuses Meta of using so-called "dark patterns" – design tricks that intentionally mislead users or deter them from performing certain actions. In this case, the dark patterns allegedly made the process of reporting illegal content more difficult. Meta is therefore suspected of violating its obligations under the DSA, which stipulates clear and simple reporting channels.
Both companies, Meta and TikTok, now have the opportunity to respond to the allegations. If their answers are not convincing, they face fines of up to six percent of their global annual revenue.
Internal problems at Meta and new lawsuits in the USA
In parallel with the EU investigations, Meta is also under pressure in the US. Several states are accusing the company of intentionally designing its platforms to be addictive. Internal company research reportedly showed in 2021 that Instagram, in particular, has negative effects on the mental health of teenagers—especially young girls. However, these findings are said to have been suppressed.
When the results became public through research by the Wall Street Journal, Meta denied that the studies had been accurately reported. Nevertheless, several U.S. states filed lawsuits accusing Meta of knowingly developing products harmful to teenagers.
A court in Washington, D.C. has now ruled that Meta cannot invoke attorney-client privilege to conceal internal documents from investigators. Judge Yvonne Williams found that Meta's lawyers allegedly advised researchers to "remove," "block," or "lock up" incriminating findings. According to Williams, these communications fell under an exception to attorney-client privilege because they were intended to conceal potential liability or commit fraud. The first of the lawsuits against Meta is scheduled to go to trial next year.
Possible consequences for Meta and TikTok
The EU investigations are ongoing. If the suspicions are confirmed, Meta and TikTok could face heavy fines. The Digital Services Act stipulates that companies that violate key obligations can be fined up to six percent of their global annual turnover. This would send a clear message to Meta and TikTok, whose annual revenues run into the billions.
With the DSA, the EU Commission aims to ensure that large platforms take responsibility – especially when it comes to protecting young people. The Meta and TikTok cases demonstrate that this responsibility is often neglected in practice. While the companies publicly emphasize that they take security and child protection seriously, the current allegations suggest the opposite.
Child protection online: The EU is serious about the Digital Services Act
The results of the EU investigation make it clear that the protection of children and young people online is still not adequately guaranteed. Neither Meta nor TikTok were able to demonstrate that they meet the requirements of the Digital Services Act. The obstacles for researchers, the complicated reporting processes, and the potential use of dark patterns cast a negative light on the companies' approach to responsibility and transparency.
With the Digital Services Act, the EU has created a powerful tool to hold digital platforms accountable. If Meta and TikTok continue to fail to meet their obligations, the Commission can impose fines and consider more stringent measures. This case demonstrates that the days when big tech companies could act without serious consequences are over. The EU wants to ensure that children and young people are not just users in the digital space, but also protected there. (Image: Shutterstock / PV productions)