Our latest #sextortion report sheds light on the pressure tactics perpetrators use to create fear and shame in young victims. Financial threats and manipulated images are often weaponized to convince children their lives will be ruined if the imagery is leaked. This pressure can become a major roadblock, preventing kids from reporting or seeking help. Early, open conversations between children and their parents or caregivers are crucial so children feel comfortable sharing these experiences without fear of harsh judgment.
About us
We are Thorn. Our mission of defending children from sexual exploitation and abuse is deeply embedded within our core—a shared code that drives us to do challenging work with resilience and determination. Here, you’ll work among the best hearts and minds in tech, data, and business, creating powerful products that protect children’s futures. Unleash your own formidable talents while learning among peers and growing every day. All in a supportive environment of wellness, care, and compassion. Build your career as we help build a world where every child can be safe, curious, and happy.
- Website
- http://www.thorn.org
- Industry
- Non-profit Organizations
- Company size
- 51-200 employees
- Headquarters
- Manhattan Beach, CA
- Type
- Nonprofit
- Founded
- 2012
- Specialties
- technology innovation and child sexual exploitation
Locations
- Primary: Manhattan Beach, CA 90266, US
Updates
-
Did you miss our webinar about financial #sextortion? We shared our latest research, conducted with the National Center for Missing & Exploited Children, to shed light on the increased threat to children online. Highlights from this important conversation:
1️⃣ The majority of victims are boys aged 14 to 17.
2️⃣ Perpetrators use a range of methods to pressure children and increase the perceived severity of their imagery being exposed.
3️⃣ Of the reports in the study that described specific impacts of the experience, more than 1 in 6 mentioned self-harm or suicide.
4️⃣ Practical tips for parents to help prevent this harm, and how we can all make a difference to stop financial sextortion.
It’s not too late to watch! Catch up on the conversation here: https://lnkd.in/gEtjYiqd
-
☕ For the price of a cup of coffee, your monthly gift will help Thorn:
- Build technology to help law enforcement agents find victims faster
- Design solutions that equip platforms to detect, review, and report child sexual abuse material (CSAM) at scale
- Help stop the revictimization of children that occurs every single day
- Create valuable, relevant resources for parents and youth to prevent abuse before it starts
- Conduct original and groundbreaking research that helps people around the world better understand the landscape and protect children
- Influence policy designed to protect children
https://lnkd.in/ggv7CTaS
-
Our latest report reveals the harsh reality of financial sextortion: 38% of victims made payments to their perpetrators, but this often did not stop the harassment. In fact, 27% of those who paid experienced continued demands after the first payment. Visit our blog for resources on what to do if you or someone you know is being sextorted online: https://lnkd.in/gzW9ea8u
-
Join us tomorrow for an insightful discussion with the National Center for Missing & Exploited Children about the alarming rise in financial sextortion cases. You'll gain insight into exploitation tactics, learn about support systems and resources, and find out how you can make a difference. Register now: https://lnkd.in/gD7hBPqT
-
Our latest research, in collaboration with the National Center for Missing & Exploited Children (NCMEC), highlights a disturbing rise in financially motivated #sextortion, predominantly targeting boys aged 14-17. Children are being exploited for monetary gain by organized groups that leverage popular platforms for victimization, threaten social shaming, and even use #artificialintelligence to create explicit images. Our study emphasizes the importance of awareness, open conversations about online safety, and robust platform measures to combat these threats. Read our blog to learn more: https://lnkd.in/gWCne8wk
-
Making a monthly donation isn't a small commitment — but it's an incredibly meaningful one. Your monthly support will ensure we can:
🌐 Continue advocating for Safety by Design in generative AI technologies to prevent them from being used to perpetrate child sexual abuse.
💻 Equip more platforms to detect child sexual abuse material (CSAM).
📝 Address threats to child safety with research and resources.
🏛️ Shape policy and legislation.
Join our community of Thorn Builders today: https://lnkd.in/dtdqkXBZ
-
We are thrilled to see Thorn and our child safety red teaming services highlighted in Anthropic’s latest blog. Our child safety red team sessions are designed to test AI models and identify risks and vulnerabilities related to child sexual abuse. Without child safety mitigations in place, bad actors can and do misuse generative AI technologies. Be sure to check out the article's informative overview of the benefits and challenges of Policy Vulnerability Testing for trust and safety risks.
Today, we're sharing a sample of red teaming methods we’ve used to test our AI systems. We detail challenges, findings, and the need to work towards common industry standards: https://lnkd.in/eR-6jd7Y
Challenges in Red Teaming AI Systems (anthropic.com)
-
There has been an alarming rise in financial #sextortion involving youth. Brand-new Thorn research, to be released on June 24, gives insights into just how pervasive this devastating form of exploitation has become. Join us Wed, June 26 at 3 p.m. ET for a critical conversation to learn about:
• The alarming rise in financial sextortion cases
• Insights into exploitation tactics
• Support systems and where to find help for victims
• How to make a difference
Register now: https://lnkd.in/ggySADR5