Community Signal

Building a Database of CSAM for AOL, One Image at a Time

Synopsis

If you work in content moderation, or with a team that specializes in it, then you know that the fight against child sexual abuse material (CSAM) is a challenging one. According to The New York Times, technology companies reported a record 45 million online photos and videos of child sexual abuse in 2018. Ralph Spencer, our guest for this episode, has been working to make online spaces safer and combat CSAM for more than 20 years, including as a technical investigator at AOL. Ralph describes how, when he first started at AOL in the mid-’90s, the work of finding and reviewing CSAM was largely manual: his team depended on community reports, and every piece of flagged content was reviewed by hand. Eventually, this manual review led to the creation of AOL’s Image Detection Filtering Process (IDFP), which reduced the need to manually review the actual content of CSAM. Working with the National Center for Missing and Exploited Children (NCMEC), law enforcement, and a coalition of other companies, Ralph shares how these partnerships shaped the broader fight against CSAM.
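The episode doesn’t go into IDFP’s internals, but systems of this kind generally work by fingerprinting files and matching them against a database of hashes of previously confirmed images, so that known material can be flagged and escalated without a moderator having to view it again. Below is a minimal Python sketch of that general hash-matching idea; the names (`KNOWN_HASHES`, `is_known_match`) and the choice of SHA-256 are illustrative assumptions, not details of AOL’s actual system.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of previously confirmed images.
# In a real system this would be a large, access-controlled store
# maintained with NCMEC and partner companies, not an in-source set.
KNOWN_HASHES: set[str] = set()


def file_sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_match(path: Path) -> bool:
    """True if the file's hash matches a previously confirmed image,
    letting it be escalated without a human re-viewing the content."""
    return file_sha256(path) in KNOWN_HASHES
```

The design point is the one the synopsis highlights: once an image has been confirmed, exact-match fingerprinting catches every subsequent copy automatically, sparing reviewers repeated exposure. Note that simple file hashes only catch exact duplicates; later systems added perceptual hashing to handle resized or altered copies.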