Reddit and Discord are currently at the center of a debate about age verification, data privacy, and legal obligations. Both platforms used a controversial external age verification service – with significant consequences. While Reddit had to pay a million-dollar fine, Discord faced massive criticism from its own community.
At its core, the issue revolves around a difficult question: How can children be effectively protected on the internet without forcing users to disclose unnecessary personal data and without platforms violating data protection law?
The starting point for these current developments is the British Online Safety Act (OSA). This law obliges online platforms to verify the age of their users and ensure that children are not exposed to harmful or inappropriate content.
For platforms like Reddit and Discord, this means they must implement reliable age verification mechanisms. At the same time, they are under pressure to store as little sensitive data as possible themselves. This balancing act between protecting minors and data privacy has now led to significant problems.
Inadequate controls at Reddit
Reddit has been fined £14.47 million (around $19.5 million) by the UK's data protection authority, the Information Commissioner's Office (ICO). The platform was accused of unlawfully processing children's personal data (via BBC News).
According to an investigation by the ICO, Reddit did not properly verify the age of its users. This created a risk that minors could access content that was unsuitable or potentially harmful to them.
To comply with legal requirements, Reddit contracted the external company Persona to handle age verification. Persona was tasked with checking either uploaded selfies or photos of official identification documents. Reddit explained that this solution was chosen to avoid storing direct information about the users' identities.
The ICO concluded, however, that these controls were insufficient. Many children were incorrectly classified as adults. Consequently, personal data of minors was processed without a legal basis. This point alone led to the substantial fine.
This case demonstrates that outsourcing sensitive processes to third-party providers does not absolve a company of responsibility. The obligation to correctly implement age verification remains with the platform itself.
Discord is experiencing a backlash from users
Discord also used Persona for age verification. Unlike Reddit, however, the primary issue was not a government penalty but a significant backlash from users.
Discord has faced criticism on social media for not being transparent about its planned use of facial scans and uploaded IDs. The partnership with Persona has been particularly criticized. Persona is used not only by Reddit but also by other platforms like Roblox.
Users pointed to Persona's privacy policy, which states that the company may obtain personal data from "third-party databases, government registers, and other publicly available sources." This statement heightened concerns that sensitive identity data could be processed more extensively than anticipated.
Savannah Badalich, Discord's head of product policy, confirmed that there had been a limited test of Persona in the UK. This test was conducted because age verification was already legally mandated there. According to Badalich, the test has now ended, and Discord no longer uses Persona.
The Online Safety Act as the trigger for the development
The British Online Safety Act provides the legal basis for stricter age verification measures. The aim of the law is to effectively protect children from harmful online content.
However, implementation presents practical and data protection challenges. The stricter the age verification, the more frequently users have to upload sensitive documents such as official photo IDs or video selfies. If such data is shared with numerous app developers or platforms, there is an increased risk of data breaches.
The current situation illustrates that legal regulations alone do not offer a simple solution. They partially shift the problem to a technical and organizational level.
Demands for accountability from Apple and Google
Given the difficulties faced by platforms like Reddit and Discord, there are increasing calls to shift more responsibility for age verification to the operators of the app stores.
Specifically, this concerns Apple and Google. The idea is that age verification would be handled centrally through the app stores: users would prove their age once, and age-restricted apps could then be blocked automatically for underage accounts.
Some US states already have similar regulations. Such a solution would have several implications:
- The age verification would be carried out in a bundled and standardized manner.
- Users would not have to submit their identity multiple times to different developers.
- The risk of sensitive identity data being shared with numerous parties would decrease.
- The user experience would be more consistent and less time-consuming.
At the same time, it is understandable that Apple in particular is hesitant to assume this responsibility. A centralized identity verification process would entail considerable technical, legal, and organizational requirements.
Reddit and Discord caught in the crossfire of new legal requirements
The cases of Reddit and Discord clearly demonstrate the complexity of implementing age verification in the digital space. Reddit was fined heavily for inadequate controls, as children were incorrectly classified as adults and their data was processed unlawfully. Discord, in turn, faced massive criticism after details of its collaboration with Persona were publicly discussed.
The UK's Online Safety Act increases pressure on platforms to implement effective safeguards. At the same time, concerns about data privacy are growing as users are forced to share official identification and biometric data with numerous companies.
The debate surrounding a centralized age verification system via Apple or Google is therefore likely to gain further importance. One thing is clear: protecting children online is essential. However, it is equally clear that this protection must not lead to the uncontrolled collection of sensitive personal data. (Image: Shutterstock / Tada Images)