Extension:MediaModeration

The MediaModeration extension supports filtering and removing possible child exploitation content.

The purpose is to improve the Foundation’s existing workflows for child protection content.

Currently, when the Foundation receives a report of images that depict child sexual abuse, we delete them from the projects and report them to law enforcement according to our legal requirements. This setup requires volunteers, who unlike staff have no professional training or mental health support, to be the first to handle this very emotionally taxing content.

This MVP aims to protect the community from exposure to such content in nearly all cases and to get it off the platform much faster. It would check images against a database of hashes of known child sexual abuse images, allowing Foundation staff to remove them and report their existence to law enforcement.
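The matching step can be illustrated conceptually. The sketch below is not the extension's real code (the extension itself is PHP and PhotoDNA is a proprietary perceptual-hash service); the hash function, file name, and helper names are placeholders assumed purely for illustration.

```python
# Conceptual sketch only: stands in for "check an image against a database of
# hashes of known images". PhotoDNA uses a proprietary perceptual hash; the
# sha256 digest and the local hash file here are hypothetical stand-ins.
import hashlib
from pathlib import Path

KNOWN_HASHES_FILE = Path("known_hashes.txt")  # hypothetical local hash list


def load_known_hashes(path: Path) -> set[str]:
    """Load one hex-encoded hash per line."""
    return {line.strip() for line in path.read_text().splitlines() if line.strip()}


def is_known_match(image_bytes: bytes, known_hashes: set[str]) -> bool:
    """True if the image's digest appears in the known-content database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes


# A positive match is only flagged for Foundation staff review;
# nothing is deleted automatically (see below).
```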

This MVP could eventually plug into other Trust & Safety workflows dealing with terrorism content, letting Foundation staff review whether flagged material meets our existing criteria for credible threats of immediate harm.

This MVP does not automatically remove any content without human review by Foundation staff.

Functionality
MediaModeration provides the following (a sketch of the combined workflow follows the list):

 * Checks uploaded images against PhotoDNA
 * Sends an email to configured recipients if suspicious content is found
 * Allows sending 160x160 thumbnails instead of the full image
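Read together, the three items form one pipeline: scale the upload down to a 160x160 thumbnail, submit it for matching, and email the configured recipients on a hit. The Python sketch below is a hedged illustration of that flow, not the extension's actual PHP implementation; the matching endpoint, response field, sender, recipients, and SMTP host are all assumptions.

```python
# Illustrative pipeline: thumbnail -> match check -> staff notification.
# Endpoint URL, "is_match" field, addresses, and SMTP host are hypothetical.
import io
import smtplib
from email.message import EmailMessage

import requests
from PIL import Image

MATCH_ENDPOINT = "https://example.org/photodna/match"  # hypothetical service URL
RECIPIENTS = ["trust-and-safety@example.org"]          # hypothetical recipients


def make_thumbnail(image_bytes: bytes, size: int = 160) -> bytes:
    """Downscale the upload so only a small thumbnail leaves the wiki."""
    img = Image.open(io.BytesIO(image_bytes))
    img.thumbnail((size, size))  # fit within 160x160
    buf = io.BytesIO()
    img.save(buf, format="PNG")
    return buf.getvalue()


def check_upload(image_bytes: bytes) -> bool:
    """Submit the thumbnail to the matching service; True means a match."""
    thumb = make_thumbnail(image_bytes)
    resp = requests.post(MATCH_ENDPOINT, files={"image": ("thumb.png", thumb)})
    resp.raise_for_status()
    return bool(resp.json().get("is_match"))


def notify_staff(filename: str) -> None:
    """Email recipients so Foundation staff can review and act; no auto-removal."""
    msg = EmailMessage()
    msg["Subject"] = f"MediaModeration: possible match for {filename}"
    msg["From"] = "mediamoderation@example.org"  # hypothetical sender
    msg["To"] = ", ".join(RECIPIENTS)
    msg.set_content(f"The upload {filename} matched the hash database; please review.")
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)
```

Sending only a reduced thumbnail, as the last list item describes, limits how much of the original image ever leaves the wiki during the check.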