
Transparency Theatre
The numbers are staggering and increasingly meaningless. In the first half of 2025, TikTok's automated moderation systems achieved a 99.2 per cent accuracy rate, removing over 87 per cent of violating content before any human ever saw it. Meta's transparency reports showed content restrictions based on local law dropping from 84.6 million in the second half of 2024 to 35 million in the first half of 2025. YouTube processed 16.8 million content actions in the first half of 2024 alone. X reported suspending over 5.3 million accounts and removing 10.6 million posts in six months. These figures appear in transparency dashboards across every major platform, presented with the precision of scientific measurement. Yet beneath this veneer of accountability lies a fundamental paradox: the more data platforms publish, the less we seem to understand about how content moderation actually works, who it serves, and whether it protects or harms the billions of users who depend on these systems.


