Extension:MediaModeration

The MediaModeration extension detects possible child exploitation content.

Purpose
The purpose of the extension is to improve the Wikimedia Foundation's existing workflows for handling child protection content.

Prior to deployment of this extension, when the Foundation receives a report of images that depict child sexual abuse, the images are deleted from the projects and reported to law enforcement according to legal requirements. This setup requires volunteers, who, unlike staff, have no professional training or mental health support, to be the first to deal with this very emotionally taxing content.

This extension aims to protect the community from being exposed to such content in nearly all cases and to remove it from the platform much faster. It checks images against a database of hashes of known images of child sexual abuse and notifies Foundation staff of hash matches, so that staff can remove the images and report their existence to law enforcement.

This extension does not automatically remove any content without human review by Foundation staff.

Functionality
MediaModeration provides the following:

 * Checks uploaded images against PhotoDNA
 * Sends email to pre-configured recipients if suspicious content is found
 * Allows sending 160x160 thumbnails instead of full images
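To illustrate the kind of check involved, the sketch below builds an HTTP request for PhotoDNA's Match operation. The endpoint URL, header name, and payload shape here are assumptions based on Microsoft's public PhotoDNA Cloud Service documentation, not taken from the extension's source; verify them before any real use.

```python
import json

# Assumed PhotoDNA Cloud Service "Match" endpoint (assumption, not from
# the extension's source) -- confirm against Microsoft's documentation.
PHOTODNA_MATCH_URL = "https://api.microsoftmoderator.com/photodna/v1.0/Match"

def build_match_request(image_url: str, subscription_key: str):
    """Build the pieces of an HTTP POST asking PhotoDNA whether the image
    at image_url matches a known hash. Returns (url, headers, body)."""
    headers = {
        # Standard Azure API Management subscription header (assumption).
        "Ocp-Apim-Subscription-Key": subscription_key,
        "Content-Type": "application/json",
    }
    body = json.dumps({
        # Ask the service to fetch the image by URL rather than sending bytes.
        "DataRepresentation": "URL",
        "Value": image_url,
    })
    return PHOTODNA_MATCH_URL, headers, body

# The returned pieces can be handed to any HTTP client, e.g.
# requests.post(url, headers=headers, data=body).
url, headers, body = build_match_request(
    "https://example.org/thumb/160px-Example.jpg", "YOUR-SUBSCRIPTION-KEY")
```

Sending a 160x160 thumbnail URL instead of the full image, as the option above allows, limits what leaves the wiki while still giving PhotoDNA enough data to compute its hash.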

Pre-requisites
Before installation, obtain a PhotoDNA subscription key from the Microsoft cloud portal.

Configuration
After installation, the extension must be configured in LocalSettings.php.
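A minimal configuration might look like the following sketch. The variable names below are assumptions for illustration; check them against the extension's own documentation before use.

```php
// Sketch only — verify variable names against the extension's documentation.
wfLoadExtension( 'MediaModeration' );

// PhotoDNA subscription key obtained from the Microsoft cloud portal
// (assumed variable name).
$wgMediaModerationPhotoDNASubscriptionKey = 'YOUR-SUBSCRIPTION-KEY';

// Recipients notified by email when a hash match is found, and the
// sender address (assumed variable names).
$wgMediaModerationRecipientList = [ 'trust-and-safety@example.org' ];
$wgMediaModerationFrom = 'no-reply@example.org';
```

As the Functionality section notes, matches only trigger notification emails; removal and reporting remain manual steps performed by Foundation staff.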