Thorn

Non-profit Organizations

Manhattan Beach, CA · 31,331 followers

About us

We are Thorn. Our mission of defending children from sexual exploitation and abuse is deeply embedded within our core—a shared code that drives us to do challenging work with resilience and determination. Here, you’ll work among the best hearts and minds in tech, data, and business, creating powerful products that protect children’s futures. Unleash your own formidable talents while learning among peers and growing every day. All in a supportive environment of wellness, care, and compassion. Build your career as we help build a world where every child can be safe, curious, and happy.

Website
http://www.thorn.org
Industry
Non-profit Organizations
Company size
51-200 employees
Headquarters
Manhattan Beach, CA
Type
Nonprofit
Founded
2012
Specialties
technology innovation and child sexual exploitation

Updates

  • How can we practice empathy in our digital interactions? Digital mindfulness is the practice of fostering a healthy and safe online environment. It means prioritizing mental health, avoiding judgment and victim-blaming, respecting digital consent, and sharing information intentionally and securely. At NoFiltr, our youth program, we emphasize the importance of digital mindfulness for everyone, regardless of age. Together, we can create safer digital spaces by caring for ourselves and each other online. When navigating tricky online situations, remember: digital mindfulness is your friend. What does digital mindfulness mean to you? Let us know in the comments below!

  • Join us August 15 for an insightful Thorn Connect livestream! You’re invited to How Thorn’s AI-Driven Tech Protects Children, a 15-minute LinkedIn Live conversation with William Rivas-Rivas, Director of Philanthropy, and 🦄 Amanda H. Volz, VP of Customers & Strategic Partnerships, where we’ll discuss how our state-of-the-art technology, Safer Predict, is transforming the fight against online child sexual abuse and exploitation.

    📅 Date: August 15
    🕒 Time: 12pm PT / 3pm ET
    🎥 Speakers: William Rivas-Rivas and 🦄 Amanda H. Volz

    In this session, you’ll learn:
    🔍 How Safer Predict makes a difference: hear how our innovative solution is helping turn the tide on the crisis of online child sexual abuse and exploitation
    💻 What impact this technology has on the lives of children
    💪 The power of your support: find out how your involvement can extend the reach of Safer Predict and drive further innovations to protect children

    Don’t miss this opportunity to join the conversation and understand how Safer Predict empowers platforms to combat online child sexual abuse and exploitation. Tune in and be a part of the solution.

    How Thorn’s AI-Driven Tech Protects Children
    www.linkedin.com

  • Do you know a teenager who's passionate about creating a safer digital world? This opportunity is for them! Applications are open until August 12 for our NoFiltr Youth Innovation Council, where a cohort of young visionaries is shaping the future of online safety.

  • Have you ever wondered how your email knows which messages are spam? That’s the magic of classifiers at work. Classifiers are powerful algorithms that use machine learning to sort data into categories automatically. At Thorn, we’re leveraging this technology to tackle a far more critical issue: child sexual abuse material (CSAM) and child sexual exploitation (CSE). Our Safer Predict solution is a groundbreaking tool that uses machine learning models to identify new or unknown CSAM in both images and videos, as well as messages and conversations that indicate CSE.

    Traditionally, identifying new CSAM and CSE has relied on manual processes, placing a heavy burden on human reviewers and user reports. Leveraging CSAM classifiers enables teams to work faster and more efficiently. This technology dramatically speeds up the process of finding and removing harmful content, which is crucial when new CSAM may indicate a child currently being abused or when sexually exploitative conversations with a child are beginning. By reducing the time to identify victims, we can intervene faster and protect more children from ongoing harm.

    How do you see the role of machine learning and AI evolving in the fight against online exploitation? Let us know your thoughts in the comments below.
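To make the spam analogy above concrete, here is a minimal toy classifier sketch. This is illustrative only and is not Thorn's Safer Predict: the training data is invented, and a real CSAM/CSE classifier involves far more careful data handling, modeling, and human review. It simply shows how a model learns to sort messages into categories automatically.

```python
# Toy example: a generic text classifier, in the spirit of the spam-filter
# analogy. NOT Thorn's model; purely a sketch of the classification idea.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: messages labeled "spam" or "ham".
train_texts = [
    "WIN a FREE prize now!!!", "Claim your reward, click here",
    "Lunch at noon?", "Here are the meeting notes from today",
]
train_labels = ["spam", "spam", "ham", "ham"]

# TF-IDF turns text into numeric features; logistic regression learns
# a decision boundary between the two categories.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

# The trained classifier assigns a category (and a confidence score)
# to unseen messages automatically.
print(model.predict(["Free prize! Click now"]))   # likely "spam"
print(model.predict_proba(["Lunch tomorrow?"]))   # per-class probabilities
```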

  • What is sextortion? At times, perpetrators will use the threat of spreading a victim’s intimate imagery to extort additional actions. Sextortion, a blend of the words sex and extortion, is when someone’s sexual imagery is used to extort them. Put simply, it’s when someone (a sextortionist) blackmails or threatens to expose another person’s sexual imagery to make that person do something they don’t want to do, like send more compromising photos, maintain contact, or send money.

    Between 2019 and 2021, the number of reports involving sextortion more than doubled. Previously, the primary motive of offenders was to get more explicit images of a child, but in reports from early 2022, 79% of offenders were seeking money.

    Get help: If you’re asked to share something that makes you uncomfortable, you have a right to say no, even if you already shared something before. Remember: THEY are the ones doing something wrong. Text “THORN” to 741741 to speak confidentially with a trained counselor. Learn more: https://lnkd.in/eRcuGrgw

  • Are you considering giving your child their first device? Remember, it’s not just about handing over a gadget; it’s about understanding what this new responsibility means for both them and you.

    Pro tips 👇
    ✅ Spend time online with your child to understand their habits and interests.
    ✅ Teach them how to stay safe, both when you’re supervising and when they’re on their own.

    Understanding their motivations helps you lay the foundation for responsible use and safety. How do you talk to your kids about what having their own device means? Ready to start the conversation? Learn more about fostering a safe and healthy online experience for your kids: https://lnkd.in/g-kGDfWN

    Device Access Guide
    info.thorn.org

  • When a social media platform discovers child sexual abuse material (CSAM) circulating on its site, its team is suddenly faced with a daunting task: sifting through enormous volumes of digital evidence, sometimes millions of files, to find clues that could help identify the child victims. These forensic reviews can take weeks, even months, delaying investigations and the protection of children. Until now.

    Thorn’s CSAM Classifier plays a critical role in speeding up these investigations. Using state-of-the-art machine learning, the classifier automatically identifies which files are likely to be CSAM and categorizes them for officers. By processing files far faster than a human reviewer could manually, often within mere hours, the classifier accelerates officers’ ability to solve such cases.

    Read how Thorn is helping investigators find children faster: https://lnkd.in/e38qeMVJ
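As a rough illustration of the triage pattern described above, here is a sketch of classifier-assisted prioritization. Everything in it is a hypothetical stand-in: the classify function represents a real ML model, the threshold is arbitrary, and this is not Thorn's CSAM Classifier, just the general idea of scoring files and ordering a human review queue.

```python
# Illustrative sketch only: classifier-assisted triage of a large file set.
# The model, threshold, and scoring API are hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class ScoredFile:
    path: str
    score: float  # model's estimated probability the file is of interest

def triage(paths, classify, review_threshold=0.8):
    """Score every file, then surface the likeliest matches first so human
    reviewers spend their limited hours on the highest-priority evidence."""
    scored = [ScoredFile(p, classify(p)) for p in paths]
    flagged = [f for f in scored if f.score >= review_threshold]
    # Highest-confidence files go to the front of the review queue.
    flagged.sort(key=lambda f: f.score, reverse=True)
    return flagged

# Usage with a dummy classifier (a real one would run an ML model per file):
demo = triage(["a.jpg", "b.mp4"],
              classify=lambda p: 0.9 if p.endswith(".jpg") else 0.1)
print([f.path for f in demo])  # -> ['a.jpg']
```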

  • Parents often have misconceptions about the ages at which children are sharing nudes. Plus, many people lack confidence in speaking with their child about these topics. Thorn found that fewer than 1 in 3 parents have talked to their children about SG-CSAM (self-generated child sexual abuse material, or “nude selfies”). Building knowledge and resilience is crucial in preventing harmful sexual encounters before they happen. When do you think is the right age to start conversations about online safety and sharing personal images? Share your thoughts in the comments below.

Funding

Thorn: 2 total rounds
Last round: Grant, US$ 345.0K

See more info on Crunchbase