If you’re headed to #TrustCon, be sure to add our panel to your agenda. Dr. Rebecca Portnoff, Thorn’s Head of Data Science, will be joined by trust and safety leaders from All Tech is Human, OpenAI, Invoke, and Google for a panel discussion about developing and enacting Safety by Design principles for generative AI. They will share lessons learned from the implementation process and the progress made against these commitments, including the quick wins and the difficulties. #TrustandSafety #AI https://lnkd.in/d_yucYaP
Safer, Built by Thorn
Software Development
El Segundo, California 1,374 followers
Proactive child sexual abuse material (CSAM) detection built by experts in child safety technology
About us
Safer was built by Thorn to fill the need for a solution that could adequately tackle the viral spread of child sexual abuse material (CSAM). With Safer, any platform with an upload button can access industry-leading tools for proactive CSAM detection. Thorn’s proactive solutions are powered by innovative technology, trusted data, proprietary research, and an expansive network of partnerships within the child safety ecosystem. Platforms don’t have to tackle this issue alone; we can take meaningful action together. With a relentless focus on CSAM elimination via advanced AI/ML models, proprietary research, and a cutting-edge detection tool, Safer enables businesses to come together and protect children online.
- Website
- https://bit.ly/3BNWroZ
- Industry
- Software Development
- Company size
- 51-200 employees
- Headquarters
- El Segundo, California
- Founded
- 2019
- Specialties
- Child safety, Platform safety, Online safety, Content identification, Image identification, Video identification, and Content moderation
Updates
-
In the last couple of years, trust and safety teams have witnessed numerous trends emerge that have impacted child safety. As we continue to adapt and innovate, your expert insights are more important than ever. If you don’t see your answer listed, please comment below!
-
With children already highly active in online communities, the need to safeguard them from exploitation and abuse, and to proactively address the risks to your platform, has never been more urgent. Whether you are building a platform from scratch or refining existing policies, we share the key considerations for fortifying your child safety measures: https://lnkd.in/gWvTKnc5
-
In addition to identifying known #CSAM, our classifiers use machine learning to predict whether new content is likely to be CSAM and flag it for further review by content moderators. In 2023, our customers classified 1,546,097 images and videos as potential CSAM. Read our full impact report: https://lnkd.in/g9bCpySG
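The flagging workflow described above can be sketched in miniature: content scored by a classifier is routed to a human review queue when its score crosses a threshold, rather than being actioned automatically. This is an illustrative sketch only; the class names, threshold value, and scoring field are assumptions for the example, not Safer’s actual API.

```python
from dataclasses import dataclass

# Assumed operating point for this sketch; real systems tune this
# threshold against precision/recall requirements.
REVIEW_THRESHOLD = 0.8


@dataclass
class UploadedMedia:
    media_id: str
    classifier_score: float  # model's estimated probability the content is CSAM


def triage(items):
    """Split uploads into a moderator review queue and a pass-through list."""
    review_queue, passed = [], []
    for item in items:
        if item.classifier_score >= REVIEW_THRESHOLD:
            review_queue.append(item.media_id)  # escalate to human review
        else:
            passed.append(item.media_id)
    return review_queue, passed


uploads = [
    UploadedMedia("img-001", 0.95),
    UploadedMedia("img-002", 0.12),
    UploadedMedia("vid-003", 0.83),
]
flagged, cleared = triage(uploads)
print(flagged)  # ['img-001', 'vid-003']
print(cleared)  # ['img-002']
```

The key design point is that the classifier only prioritizes content for review; the final determination stays with trained moderators.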
-
As we prepare for our upcoming trust and safety insights session, “Combating Financial Sextortion: The Latest Trends and Proactive Solutions,” we want to ensure it addresses the questions you have about mitigating this risk on your platform. What questions do you have about financial sextortion?
👉 Share your questions in the comments below.
👉 Don’t forget to reserve your spot for a chance to have your questions answered live by our experts, Thorn’s Melissa Stroebel and 🦄 Amanda H. Volz, in partnership with National Center for Missing & Exploited Children.
This is your opportunity to shape the conversation and gain tailored insights to protect your community. Join us! Reserve your spot now: https://lnkd.in/dNZwMcCX
#TrustAndSafety #Webinar #FinancialSextortion
-
🗓️ T&S Insights Webinar | July 31
Join Thorn’s Rob Wang and 🦄 Amanda H. Volz for a webinar about machine learning solutions for proactive detection of online sexual harms against children.
Why should you attend?
1️⃣ Understand the current trends: See the trend lines for online sexual harms against children (image, video, text, etc.).
2️⃣ Learn proactive solutions: Learn about available machine learning solutions for proactive detection.
3️⃣ Expert insights: Discover additional strategies to consider for a holistic approach to online child safety.
This is an opportunity for trust and safety professionals to equip themselves with the knowledge and tools to mitigate this risk effectively on their platforms.
Reserve your spot: https://lnkd.in/gVexFDMD
#trustandsafety
-
Our customers detected more than 2 million images and videos of known child sexual abuse material (CSAM) in 2023. We also give our customers the option to share their hash values with Safer’s community to help diminish the viral spread of #CSAM. By detecting and reporting CSAM, our customers are helping to build a safer internet for everyone. Read our full impact report: https://lnkd.in/g7Rpmt-q
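The shared-hash approach mentioned above can be illustrated with a minimal sketch: an upload’s hash is checked against a set of hashes contributed by the community, so known material can be caught the moment it reappears. Note that real deployments rely on perceptual hashing, which tolerates re-encoding and resizing; the plain SHA-256 exact match below, and the sample hash set, are simplifying assumptions made only to show the workflow.

```python
import hashlib

# Stand-in for a community-shared hash list of known CSAM.
# The entry here is a made-up sample, not real data.
known_hashes = {
    hashlib.sha256(b"known-bad-sample").hexdigest(),
}


def is_known_match(content: bytes) -> bool:
    """Return True if the content's hash appears in the shared hash set."""
    return hashlib.sha256(content).hexdigest() in known_hashes


print(is_known_match(b"known-bad-sample"))  # True: hash is on the shared list
print(is_known_match(b"ordinary upload"))   # False: no match, content passes
```

Because each platform that opts in contributes hash values rather than the content itself, matches can be shared across the community without redistributing the underlying material.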
-
📣 Thorn’s 🦄 Amanda H. Volz shares her thoughts on our latest Trust & Safety Insights Brief and why she thinks this brief is a must-read for safeguarding your platform from the risks of bad actors targeting children. Check out her thoughts in the video below! 👇
Download our latest brief: https://lnkd.in/g-ccVmh4
#TrustAndSafety #OnlineSafety #ThornInsights
-
Safer, Built by Thorn reposted this
We are pleased to invite you to a webinar titled “Tech for Good: Using Machine Learning to Detect Sexual Harms Against Children.” Join Amanda Volz and Rob Wang from Thorn as we explore the use of machine learning for the detection of CSAM and CSE. Discover Thorn’s newest solution designed to proactively detect text-based harms and policy-violating messages, including discussions of sextortion, self-generated CSAM, access to children, and more.
In this session, we will explore:
- The rise in online sexual harms, such as CSAM and CSE
- Machine learning solutions for detecting image- and text-based harms
- Additional layers to consider for a holistic approach to online child safety
Date/Time: Thursday, June 27th at 12pm ET / 9am PT
Register here! https://lnkd.in/e3H4f8Pp