Facebook reports millions of photos and videos of suspected child sexual abuse each year. But when ages are unclear, young people are treated as adults and the images are not reported to the authorities. Meta, the parent company of Facebook, Instagram, Messenger and WhatsApp, has instructed content moderators for its platforms to “err on the side of an adult” when they are uncertain about the age of a person in a photo or video, according to a corporate training document.
Antigone Davis, Meta's head of safety, said the policy stemmed from privacy concerns for those who post sexual imagery of adults. Ms. Davis emphasised that Meta employs a rigorous, multilayered review process that flags far more images than any other tech company.
She said the consequences of erroneously flagging child sexual abuse could be “life changing” for users. While it is impossible to quantify the number of images that might be misclassified, child safety experts said the company was undoubtedly missing some minors.
Technology companies are legally required to report “apparent” child sexual abuse material, but “apparent” is not defined by the law. Lianna McDonald, executive director of the Canadian Center for Child Protection, said lawmakers in Washington need to establish a clear and consistent standard for everyone to follow.
CyberBeat is a grassroots initiative from a team of producers and subject matter experts. Driven by frustration at the lack of media coverage, it responds to an urgent need for a clear, concise, informative and educational approach to the growing fields of cybersecurity and digital privacy.