Facebook tests tools to combat child exploitation


Facebook has announced that it is testing new tools to combat child exploitation on the social network and to prevent people from sharing content that exploits children.

One of the tools is a pop-up message that Facebook plans to display across its apps to people who use search terms associated with child exploitation.

The pop-up details the consequences of viewing this content and provides information on how to get help from relevant organizations.

The other tool targets the non-malicious sharing of child exploitation content: people who share such material are shown a safety alert about the harm it can cause.

The alert warns that the content violates Facebook's rules and explains the legal consequences of sharing such material.

Facebook removes the content, reports it to the National Center for Missing and Exploited Children (NCMEC), and deletes accounts that promote such material.

Facebook has also updated its child-safety policies to cover Facebook profiles, pages, groups, and Instagram accounts dedicated to sharing otherwise innocent images of children alongside captions, hashtags, or comments containing inappropriate references to the children depicted.

While the photos or videos themselves may not violate Facebook's rules, the accompanying text can help the social network determine whether the content sexualizes children and whether the account, page, or group should be removed.

In addition, the company has updated its reporting menu on Facebook and Instagram: within the nudity and sexuality category, users can now select an option indicating that the content involves a child.

Facebook says material reported in this way is prioritized for content reviewers.

It has also adopted Google's Content Safety API to detect when posts may contain child exploitation and to prioritize them for review.

The company has long used various detection systems to root out child-abuse content and to flag potentially inappropriate interactions with children or potential cases of child grooming.

Facebook says it is hunting for networks that violate its child exploitation rules, much as it does with coordinated inauthentic behavior and dangerous organizations.