Former TikTok moderators in the UK have filed a lawsuit against the company, claiming they were unfairly sacked shortly before a planned vote to establish a collective bargaining unit. The 400 content moderators who were let go had been unionizing, seeking better protection from the personal toll of processing traumatic online content at high speed.
According to John Chadfield, national officer for tech workers at the Communication Workers Union (CWU), "Content moderators have the most dangerous job on the internet." They are exposed to graphic and disturbing material, including child sex abuse images, executions, and war scenes. The CWU argues that TikTok's content moderation team is understaffed and under-resourced, with workers being asked to process vast amounts of traumatic content without sufficient support.
TikTok has denied the allegations of union-busting, saying the layoffs were part of a restructuring plan to shift content moderation to AI. The company says that 91 percent of violating content is now removed automatically by its AI systems. Workers and trade unions have criticized the move, arguing that it sidelines human judgment and expertise in moderating online content.
In August, TikTok announced a restructuring exercise just as hundreds of moderators were organizing for union recognition. John Chadfield described the timing of the announcement as "smelling of union-busting" and claimed that corporate greed was prioritized over worker safety and the public's right to know about the content on the platform.
The lawsuit filed by former TikTok moderators in the UK alleges that the company breached UK trade union laws and engaged in unfair dismissal. The workers had sought better protections against the personal toll of their job, including more say over how they kept the platform safe and input into their workflows.