Three executives from Majorel SP Solutions, S.A.U., a Barcelona-based company that moderates and filters content for TikTok, appeared in court on Monday and denied that highly disturbing content was reviewed from the Catalan capital.

They testified after an employee filed a lawsuit against four company executives for offences against workers' rights and moral integrity, and for serious injury caused by negligence. The employee alleges the company imposed absolutely inhuman and unbearable working conditions on its staff.
The defendants denied that Barcelona moderators viewed highly disturbing TikTok content, saying that task was carried out outside Spain, though they did not specify where. According to the lawyer, they claimed extremely violent content was marginal at the Barcelona office, accounting for around 1% of the material employees had to review.
Asked how disturbing content was separated from non-disturbing material, they said it was a technical matter they knew nothing about. The employee joined the company in 2019, hired as a telephone operator, and received no preventive training for the psychosocial risks of the role she would actually perform: moderating TikTok content.
She maintains she was never told she would have to view extremely violent content, much less constantly throughout her working day. According to the lawsuit, she initially reviewed videos from the German market and later the Spanish market, filtering content reported by users and classified at the most extreme level of violence.
She says she worked 14-hour days with only five minutes' rest per hour. During the remaining 55 minutes, she had to view highly sensitive and disturbing material including murders, beheadings, dismemberments, rapes, zoophilia, child pornography, child abuse, mutilations, live suicides, torture and terrorism.
The five-minute breaks were not enough to recover, according to the worker, and there was no possibility of an extra break if a particular video left her profoundly affected. She adds that the workload was extremely high: workers could view between 800 and 1,200 violent videos in a single day.
In 2023, the worker was signed off sick after going to A&E with panic attacks, dizziness and vomiting. She could no longer bear the exposure to highly sensitive content, having developed a mental disorder. A 2024 Labour Inspection report subsequently concluded there was a cause-and-effect relationship between the omission of preventive measures and the psychological after-effects suffered by the worker.
In 2025, Social Security ruled that her incapacity for work stemmed from a workplace accident, not a common illness.