Child sexual abuse content and online threats to children are increasing in the digital age [1]. These threats include minors taking and sharing sexual images of themselves, as well as risky online interactions between youth and adults [3]. A recent report from Thorn, a technology nonprofit, highlights alarming trends in online child endangerment, including the rise in self-generated child sexual abuse material (CSAM) and increasingly brazen behavior by online predators [1].


Thorn’s report reveals a concerning increase in self-generated CSAM alongside these bolder predator tactics. Such images, whether produced under coercion or shared consensually, circulate widely on online platforms, where predators exploit them for nefarious purposes [1]. Child safety organizations have reported a significant surge in CSAM files [3], with reported cases increasing by 329% over the last five years and exceeding 88.3 million files in 2022 alone [1]. Additionally, over 80,000 new reports of suspected CSAM are submitted daily to the National Center for Missing & Exploited Children’s CyberTipline [2].

To combat this issue, technology plays a crucial role in detecting and disrupting the spread of CSAM. Hashing and matching techniques convert files into unique hash values and compare them against lists of known CSAM hashes [3]; expanding that corpus of known hash values is essential for effective detection [3]. Tech companies and NGOs are key partners in eliminating CSAM from the internet, with content-hosting platforms playing an especially significant role [3]. Broad adoption of detection tools across platforms offers hope for addressing this problem and creating a safer online space for children [1]. Safer, a proactive CSAM detection tool, has already helped identify millions of pieces of CSAM [1] [3].
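The hash-and-match workflow described above can be sketched in a few lines. This is a simplified illustration, not how Safer or any production system is implemented: real deployments typically use perceptual hashing (such as PhotoDNA), which tolerates re-encoding and resizing, whereas the cryptographic SHA-256 used here matches only byte-identical files. The hash list and file contents below are placeholders invented for this example; actual known-CSAM hash lists are restricted to vetted organizations.

```python
import hashlib

# Placeholder "known hash" list for illustration only; real lists
# (e.g., those maintained by NCMEC) are not publicly available.
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-file").hexdigest(),
}

def hash_file_bytes(data: bytes) -> str:
    """Convert a file's bytes into a unique hash value."""
    return hashlib.sha256(data).hexdigest()

def matches_known_list(data: bytes, known: set[str]) -> bool:
    """Compare a file's hash against the known-CSAM hash list."""
    return hash_file_bytes(data) in known

# An upload whose hash appears on the list is flagged; a novel file is not.
assert matches_known_list(b"example-flagged-file", KNOWN_HASHES)
assert not matches_known_list(b"benign-photo", KNOWN_HASHES)
```

Because matching is a constant-time set lookup, this approach scales to the millions of uploads large platforms process daily; the hard part, as the report notes, is growing and sharing the corpus of known hashes.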


The exponential growth of CSAM poses a significant challenge, as tech companies are not currently required to proactively search for, detect, and remove CSAM within a specific timeframe [1] [2] [3]. To make a real impact, the requirements and standards governing how tech companies operate must change [2]. Doing so would ease the burden on parents and move us toward a safer online environment for children.