TikTok moderators say they were shown child sexual abuse footage during training

According to a Forbes investigation, TikTok’s moderation team allegedly had wide-ranging, unsecured access to illegal photographs and videos, raising questions about how the company handles child sexual abuse material.

Employees of Teleperformance, a third-party moderation firm that works with TikTok among other companies, say they were required to review a disturbing spreadsheet called the DRR, or Daily Required Reading, covering TikTok moderation standards. The spreadsheet allegedly contained “hundreds of images” of children who were naked or being abused, material that violates TikTok’s rules. The workers say hundreds of people at TikTok and Teleperformance had access to the content from both inside and outside the office, opening it up to far broader exposure.
TikTok says its training materials have strict access controls and do not feature visual examples of CSAM, and Teleperformance denied to Forbes that it exposed staff to sexually exploitative content. TikTok did not confirm that all its third-party vendors meet that standard. “Content of this nature is abhorrent and has no place on or off our platform, and we aim to minimize moderators’ exposure in line with industry best practices. TikTok’s training materials have strict access controls and do not include visual examples of CSAM, and our specialized child safety team investigates and makes reports to NCMEC,” TikTok spokesperson Jamie Favazza said.

The employees tell a different story, and as Forbes lays out, it’s a legally fraught one. Moderators routinely have to deal with CSAM posted across many social platforms, but depictions of child abuse are illegal in the US and must be handled with care. Companies are required to report the material to the National Center for Missing and Exploited Children (NCMEC), then preserve it for 90 days while minimizing the number of people who see it.

The allegations here go far beyond that. They claim Teleperformance showed employees graphic photos and videos as examples of what to tag on TikTok while playing fast and loose with access to that content. One employee says she contacted the FBI to ask whether the practice amounted to criminally distributing CSAM, though it’s unclear whether an investigation was opened.

The full Forbes piece is well worth reading. It describes a situation where moderators couldn’t keep up with TikTok’s explosive growth and were told to watch footage of crimes against children for reasons they felt made no sense. Even by the convoluted standards of online debates over child safety, it’s a bizarre and, if accurate, horrifying situation.


