We’re on a Mission to Eliminate CSAM from the Internet.

There is a child behind every file, hash, and piece of content. To date, Safer has identified 5+ million potential CSAM files on customer platforms.

Together, we’re building a safer internet.

Why Safer

With a relentless focus on CSAM elimination strategies, Safer helps protect content-hosting platforms and their users from the risks of hosting child sexual abuse images and videos.

Our industry-leading CSAM detection solutions are powered by:

  • Proprietary research and issue expertise
  • A large database aggregating 57+ million hashes of known CSAM
  • Cross-platform CSAM hash sharing
  • Advanced machine learning (AI) models
See how it works
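
To make the hash-matching piece above more concrete, here is a minimal, hypothetical sketch of exact-hash matching in Python: an uploaded file's cryptographic hash is checked against a set of known CSAM hashes. The hash values, set contents, and function names are illustrative assumptions, not a description of Safer's actual implementation, which also relies on perceptual hashing and other techniques.

```python
import hashlib

# Hypothetical set of known-CSAM hashes. In practice this would be a service
# backed by tens of millions of hashes aggregated from trusted sources.
KNOWN_CSAM_MD5_HASHES = {
    "d41d8cd98f00b204e9800998ecf8427e",  # placeholder value only
}

def md5_of_file(path: str) -> str:
    """Compute the MD5 hex digest of a file, streaming to avoid loading it all at once."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_csam(path: str) -> bool:
    """Return True if the file's hash matches an entry in the known-CSAM hash set."""
    return md5_of_file(path) in KNOWN_CSAM_MD5_HASHES

if __name__ == "__main__":
    # A matching file would typically be queued for human review and reporting.
    print(is_known_csam("upload.jpg"))
```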

What Makes Safer Different

  • Issue expertise

    Top content-hosting platforms rely on our issue expertise, proprietary research, and data science proficiency.

  • Matching accuracy

    Advanced hashing techniques and matching against a database aggregating 57.3 million known CSAM hash values provide highly accurate detection results (a simplified matching sketch follows this list).

  • AI trained on real CSAM

    Thorn’s CSAM classification model is trained in part using trusted data from the National Center for Missing and Exploited Children’s (NCMEC) CyberTipline.
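
As a companion to the "Matching accuracy" point above, the following hypothetical sketch shows how perceptual-hash matching can tolerate small differences between files: two hashes match if they differ in at most a chosen number of bits (their Hamming distance). The 64-bit values and the threshold of 10 are illustrative assumptions, not Safer's actual algorithms or parameters.

```python
def hamming_distance(hash_a: int, hash_b: int) -> int:
    """Number of bits that differ between two fixed-length perceptual hashes."""
    return bin(hash_a ^ hash_b).count("1")

def is_perceptual_match(candidate: int, known_hashes: list[int], threshold: int = 10) -> bool:
    """Return True if the candidate hash is within `threshold` bits of any known hash.

    Unlike cryptographic hashes, perceptual hashes of visually similar images
    are numerically close, so re-encoded or resized copies can still be detected.
    """
    return any(hamming_distance(candidate, known) <= threshold for known in known_hashes)

# Illustrative 64-bit hash values only.
known = [0x9A3F5C7E12B4D6F0]
print(is_perceptual_match(0x9A3F5C7E12B4D6F1, known))  # True: differs by a single bit
```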

The Issue

In 2023, the National Center for Missing and Exploited Children’s CyberTipline received over 104 million files of suspected child sexual abuse material (CSAM) from electronic service providers (ESPs) alone.

Reports from ESPs constitute the majority of reports received by NCMEC and show that content-hosting platforms are critical partners in addressing this issue.

Platform protection remains inconsistent across the industry. Companies that attempt to build their own solution quickly learn that the endeavor is costly, and that incomplete or siloed data sets leave them with tools that only partially address the issue for content-hosting platforms.

See how Safer customers are making an impact.

Why Thorn Made Safer

Thorn built Safer to tackle the growing CSAM issue.

With Safer, Thorn is equipping content-hosting platforms with industry-leading CSAM detection solutions to protect their platforms and users.

Hear more about Thorn’s vision in CEO Julie Cordua's TED talk.

Learn More About Thorn
“GoDaddy is proud to be a part of Thorn’s Safer community. Using their services, we can detect and remove CSEA content faster and safely share knowledge in the community in order to keep the Internet a safe and enjoyable place, especially for children.”
Chris Hauser, Director – Infosec at GoDaddy

Ready to help build a safer internet?

Let's chat about putting Safer to work for your platform.

Get in Touch