Facebook told moderators to "err on the side of an adult" when age is uncertain in possible abuse photos
Activists say regulators are not doing enough to protect children, and that Facebook is failing to comply with the rules and regulations meant to keep minors safe on its platform.
Highlights
- Many tech companies, including Facebook, fall short on child safety
- The policy applies to Facebook content moderators working at Accenture
- Interviewees described a policy called “bumping up”
Also Read: 'Bare-Knuckle' fight between Facebook and TikTok.
Tech companies today must confront, with full attention, technology's harmful impact on minors and their safety. Companies bear a major responsibility to monitor their platforms for child sexual abuse material (CSAM), and a company that finds such material on its platform is expected to report it to the National Center for Missing and Exploited Children (NCMEC).
Facebook, the giant social media platform, says it has a policy of reporting child sexual abuse content. But according to a new report from the New York Times, a Facebook training document directs content moderators to "err on the side of an adult" when they cannot determine the age of a person in a photo or video that could be CSAM.
The policy, written for Facebook content moderators working at Accenture, is discussed in the California Law Review.
Interviewees described a policy called "bumping up," which each of them personally disagreed with. Under the bumping-up policy, when a content moderator cannot readily determine whether the subject of an image is a minor or an adult, they are instructed to assume the subject is an adult, thereby allowing more images to go unreported to NCMEC.
Also Read: BSNL 4G Services to be Rolled Out Soon, Says Minister of State for Communications