
When in Doubt, Facebook Moderators “Err on the Side of an Adult” When It Comes to Possible Abuse Photos


A major responsibility for tech companies is to monitor content on their platforms for child sexual abuse material (CSAM), and if any is found, they are legally required to report it to the National Center for Missing and Exploited Children (NCMEC). Many companies employ content moderators who review content flagged as potential CSAM and determine whether it should be reported to the NCMEC.

However, Facebook has a policy that could mean it is underreporting child sexual abuse content, according to a new report from The New York Times. A Facebook training document directs content moderators to “err on the side of an adult” when they don’t know someone’s age in a photo or video that’s suspected to be CSAM, the report said.

The policy was made for Facebook content moderators working at Accenture and is discussed in a California Law Review article from August.

When reached for comment, Facebook (which is now under the Meta corporate umbrella) pointed to the quotes from Antigone Davis, its head of safety, in the NYT. Accenture didn’t immediately reply to a request for comment; it declined to comment to The New York Times.

Update March 31st, 9:09 PM ET: Facebook pointed to Davis’ quotes in the NYT.
