A recent study commissioned by the Dutch Interior Ministry aims to improve content moderation on major online platforms, prompted by TikTok’s closure of its Amsterdam office, where around 300 employees were based. State Secretary Zsolt Szabó expressed concern that too many social media platforms have not invested adequately in local linguistic and cultural expertise, particularly in the Netherlands. The investigation will explore optimal strategies for content moderation and is expected to conclude in autumn 2025, with its findings shared with the European Commission. The ministry’s engagement with TikTok is intended to address concerns about the company’s operational changes and their impact on content moderation capacity in the region.
In light of the layoffs, Szabó said that TikTok had assured the Dutch government the redundancies would not significantly reduce its overall number of moderators across the European Union. He stressed the importance of monitoring the situation: if TikTok’s assurances prove inaccurate, or if the number of Dutch moderators does decline, further scrutiny by the European Commission could be warranted to address growing systemic risks. The ongoing dialogue between the Dutch government and TikTok underscores the need for transparency about the fallout from operational changes within the company.
TikTok’s content moderation is primarily automated, with human moderators stepping in when the system flags a potential violation. This dual approach is designed to improve the platform’s machine-learning models while preserving a nuanced understanding of context and culture. According to recent reports, the company has 160 content moderators responsible for Dutch-language content, though they are not necessarily located in the Netherlands. This structure raises questions about the company’s commitment to localized moderation and its ability to handle culture-specific contexts effectively.
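To make the escalation logic concrete, below is a minimal sketch of such an automated-first pipeline with human review of flagged items. All names here (Post, automated_score, moderate) and the threshold values are illustrative assumptions; TikTok’s actual systems are proprietary and not publicly documented in this detail.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Decision(Enum):
    ALLOW = auto()
    REMOVE = auto()
    ESCALATE = auto()  # route to a human moderator for review

@dataclass
class Post:
    post_id: str
    text: str
    language: str  # e.g. "nl" for Dutch

def automated_score(post: Post) -> float:
    """Placeholder for a machine-learning classifier returning a
    violation probability between 0.0 and 1.0."""
    # A real system would run trained models here; this stub flags
    # nothing, so the pipeline below can be exercised end to end.
    return 0.0

def moderate(post: Post,
             remove_threshold: float = 0.95,
             review_threshold: float = 0.50) -> Decision:
    """Automated-first pipeline: clear-cut cases are decided by the
    model; borderline cases are escalated to a human reviewer,
    ideally one with expertise in the post's language and culture."""
    score = automated_score(post)
    if score >= remove_threshold:
        return Decision.REMOVE     # high-confidence violation
    if score >= review_threshold:
        return Decision.ESCALATE   # uncertain: needs human judgment
    return Decision.ALLOW          # no violation detected

if __name__ == "__main__":
    post = Post("p1", "voorbeeldtekst", language="nl")
    print(moderate(post))  # Decision.ALLOW with the stub scorer
```

The design point this sketch highlights is the middle band: content the model is unsure about is routed to a human reviewer, which is precisely where language- and culture-specific expertise, the kind the Dutch study is concerned with, matters most.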
Notably, as of the first half of this year, TikTok reported more than 6,000 people dedicated to content moderation across the EU. Although the overall number of moderators increased, there were notable shifts within specific language groups: the number of English-language moderators fell significantly, while the count of non-language-specific moderators rose sharply. The lack of dedicated moderators for less widely spoken languages, however, poses challenges for fair content assessment and points to a gap in TikTok’s moderation strategy.
The shortage of moderators for certain languages, such as Maltese and Irish, alongside minimal resources for Estonian and Croatian, raises concerns about TikTok’s ability to deliver culturally and linguistically relevant moderation. Because users communicate in many languages across the platform, inadequate moderation staffing can lead to misjudged content and poor handling of language-specific issues. Despite TikTok’s assertion that it has language capabilities for every EU member state, the uneven distribution of moderators suggests that further investment is needed for comprehensive coverage.
The situation unfolding around TikTok illustrates the broader challenges social media platforms face in content moderation, especially in providing localized support and context-specific understanding. The Dutch government’s study reflects a proactive approach to evaluating moderation practices and underscores the need for social media companies to align their operations with regional linguistic and cultural contexts. As the digital landscape evolves, sustained investment in human moderation becomes increasingly vital to uphold community standards across platforms like TikTok.