Jenna Dietz’s Post

Jenna Dietz

Trust & Safety Specialist at VSCO

So grateful to represent VSCO® at this event! Thank you to the Tech Coalition for organizing such an informative and impactful briefing on understanding the influence of generative AI on online child sexual exploitation. And thank you to Google's D.C. office for hosting!

Tech Coalition


Today we convened an industry briefing on the impact of generative AI on online child sexual exploitation and abuse (OCSEA). We brought together key U.S. stakeholders in the ecosystem to develop a shared understanding of the risks predatory actors pose to children through generative AI and of how companies are currently addressing those threats, as well as to identify and initiate new opportunities for stakeholder collaboration. Representatives from 27 of our Member companies, including Adobe, Amazon, Discord, Google, Meta, Microsoft, NAVER Z (ZEPETO), Niantic, Inc., OpenAI, Pinterest, Snap Inc., TikTok, VERISIGN, Verizon, VSCO®, Yahoo, and Zoom, joined select child safety experts, advocates, and members of law enforcement.

As generative AI develops and the child safety ecosystem evolves, our Members are building a deeper understanding of the issues and challenges so they can continue to be proactive in their efforts to reduce risk, incorporate safety by design, and innovate solutions to help keep children safe. The tech industry and the stakeholders with whom it engages to thwart OCSEA will continue to adapt their approaches and systems to address this new threat, as they have with past changes in technology. For this reason, today's briefing culminated in several new multi-stakeholder efforts, including:

- Red teaming: With input from the U.S. Department of Justice, we will help companies explore ways to test for and mitigate OCSEA risks.
- Information sharing: We will advance the use of the Lantern program to securely share information that supports robust safety evaluations and mitigation methods for generative AI CSAM and related OCSEA incidents.
- Industry classification system: We will review and update the Industry Classification System to address different types of AI-generated OCSEA.
- Reporting: We will work with the National Center for Missing & Exploited Children (NCMEC) to help develop a process to efficiently and effectively refer cybertip reports of AI-generated OCSEA to NCMEC.

Our work to understand the impact of generative AI on OCSEA began earlier this year, when we started bringing Members together regularly to identify emerging challenges and share learnings. Together with Thorn, we also co-hosted a webinar convening experts on child safety risks posed by generative AI, and at the Crimes Against Children Conference we brought industry together to identify and address generative AI challenges. We look forward to continuing to facilitate discussions about OCSEA and the rapidly changing space of generative AI. See the full blog post to learn more: https://lnkd.in/er_9JkQR

Tech Coalition | Tech Coalition Hosts Generative AI Briefing for Key U.S. Stakeholders

technologycoalition.org

Patricia Cartes Andrés

Trust & Safety | Public Policy | Ethical AI | Marketplace Safety | Insurance Operations


I can't wait to hear your insights from the gathering. VSCO couldn't be better represented at the Tech Coalition 🔥
