Webinar Tech for Good: Using Machine Learning to Detect Sexual Harms Against Children JULY 31 AT 12:30 P.M. PT | Featuring Rob Wang, Senior Manager of Data Science, and Amanda Volz, VP of Customers and Strategic Partnerships
Tools CSAM Keyword Hub In partnership with the Tech Coalition, Thorn has developed an API containing child sexual abuse material (CSAM) terms and phrases in multiple languages to improve your content moderation process.
Product Updates Safer’s 2023 Impact Report In 2023, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.
Webinar Combating Financial Sextortion: The Latest Trends and Proactive Solutions July 10 at 12 p.m. PT | Hosted by Thorn, in collaboration with the National Center for Missing and Exploited Children
Learn The Dual Role of Technology: Thorn’s Insights From NCMEC’s 2023 CyberTipline Report In 2023, NCMEC’s CyberTipline received a staggering 36.2 million reports of suspected child sexual exploitation.
Learn Youth Tell the Truth About Safety Tools: Advice on How to Improve These Tools From Actual Teens Platform safety tools — like blocking and reporting — are often a child’s first choice for responding to a harmful sexual interaction online. Instead of seeking support from a parent or …
Learn Safeguard Youth and Protect Your Platform Understand how to prevent risky situations involving youth and bad actors with insights from Thorn’s latest brief for digital platforms.
Learn The REPORT Act Is Now Federal Law – Here’s What It Means for Online Platforms The REPORT Act is now federal law. We provide details about its components and explain how it will impact online platforms.
Learn Unmasking the Perpetrators Online: Profiles of Bad Actors for Use by Trust and Safety Learn how easy access to children online has given rise to new types of child sexual perpetrators.
Learn 4 Considerations for Improving Your Child Safety Policies Four considerations for trust and safety teams at digital platforms as they review their child safety policies.
Learn Understanding Gen AI’s Risks to Child Safety on Digital Platforms In the last two years, generative AI has seen unprecedented advances. The technology ushered in the ability to create content and spread ideas faster than ever before. Yet these same capabilities present critical implications for child safety. In short, generative AI is introducing new …
Learn The Kids Online Safety Act (KOSA) Explained: What the Drafted Bill Could Mean for Your Online Platform Thorn's policy team explains the Kids Online Safety Act (KOSA) and how the provisions in this bill may impact digital platforms.
Learn Key Takeaways from the Online Child Sexual Exploitation Hearing with Social Media CEOs On January 31, the CEOs of Meta, TikTok, Snap, and Discord testified during the hearing, "Big Tech and the Online Child Sexual Exploitation Crisis."
Product Updates Thorn’s Head of Data Science discusses how machine learning can support child safety on content-hosting platforms Watch Dr. Rebecca Portnoff’s keynote at AWS re:Invent 2023 to learn how Thorn is using machine learning to detect CSAM.
Product Updates Introducing Safer Essential, API-Based CSAM Detection Announcing the launch of our API-based solution for proactive detection of child sexual abuse material (CSAM): Safer Essential.
Case Study VSCO Uses Safer to Protect Its Platform and Community of Creators from CSAM at Scale For VSCO, building Safer into its infrastructure unlocked automated solutions and moderation efficiencies for its trust and safety and content moderation teams.
Learn Hashing and Matching is Core to Proactive CSAM Detection Detect known CSAM using hashing and matching, sometimes referred to as CSAM scanning. Learn how it works.
Emerging Trends Report 2023 New report highlights findings from Thorn’s latest research and offers recommendations for addressing online sexual threats to children.
Learn Comprehensive CSAM Detection Combines Hashing and Matching with Classifiers Addressing CSAM requires scalable tools to detect both known and unknown content.
Product Updates Safer’s 2022 Impact Report In 2022, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.
Case Study Flickr Uses CSAM Image Classifier to Find Harmful Content Flickr’s Trust & Safety team uses Safer’s CSAM Image Classifier to detect and remove previously unknown child abuse content from their platform.
Product Updates Announcing RCMP Reporting via Safer Detect CSAM and send reports to the Royal Canadian Mounted Police from Safer, an all-in-one solution for CSAM moderation.
Learn Safer’s Self-Hosted Deployment Provides Control, Security and Scalability Safer is a flexible suite of tools designed to support your company’s processes and scale your child sexual abuse material (CSAM) elimination efforts.
Learn Optimize CSAM Detection with SaferList Safer’s self-managed hashlists help customers optimize their CSAM detection. SaferList helps close the gap between when new CSAM is reported and when it can be matched against.
Product Updates Safer’s 2021 Impact Report In 2021, Safer empowered content moderators and trust & safety professionals to detect, report and remove CSAM from their content-hosting platforms.