On the latest episode of #YourUndividedAttention, Aza Raskin talks with neuroscientist Anil Seth about the right and wrong ways to think about #AI consciousness. https://bit.ly/4cN7KOF
Center for Humane Technology
Non-profit Organizations
San Francisco, California 55,916 followers
Driving a shift toward humane technology that supports our well-being, democracy, and shared reality.
About us
Together with our partners, the Center for Humane Technology (CHT) is dedicated to radically reimagining our digital infrastructure. Our mission is to drive a comprehensive shift toward humane technology that supports our well-being, democracy, and shared information environment. From the dinner table to the corner office to the halls of government, our work mobilizes millions of advocates, technologists, business leaders, and policymakers through media campaigns, working groups, and high-level briefings.

Our journey began in 2013 when Tristan Harris, then a Google Design Ethicist, created the viral presentation “A Call to Minimize Distraction & Respect Users’ Attention.” The presentation, followed by two TED talks and a 60 Minutes interview, sparked the Time Well Spent movement and laid the groundwork for the founding of CHT as an independent 501(c)(3) nonprofit in 2018.

CHT was thrilled to support the development and release of the record-shattering Netflix documentary 'The Social Dilemma' from director Jeff Orlowski on the existential threats posed by social media, reaching an estimated 100 million people.
- Website
- http://humanetech.com
- Industry
- Non-profit Organizations
- Company size
- 11-50 employees
- Headquarters
- San Francisco, California
- Type
- Nonprofit
- Founded
- 2018
- Specialties
- Ethics, Technology, BuildHumaneTech, Human Behavior, Design, Tech, Social Media, Attention, Polarization, Mental Health, Innovation, Democracy, and AI
Locations
-
Primary
650 Townsend St
San Francisco, California, US
Updates
-
Questions of machine consciousness have dominated the public conversation around #AI. But are we asking the right questions? Neuroscientist Anil Seth joins Aza Raskin on #YourUndividedAttention to discuss how to think about artificial consciousness. https://bit.ly/4cN7KOF
-
We're spotlighting the work of Parents for Safe Online Spaces this #InternetSafetyMonth. Their journey transforming personal grief into impactful action is both inspiring and crucial to the growing advocacy for Congress to protect kids online. Read their stories in The New York Times: https://lnkd.in/eqEyiQ6q
-
Immigration lawyer Petra Molnar went to borderlands around the world to see how governments were using novel #AI and #surveillance tech in the global refugee crisis. In the latest #YourUndividedAttention, she talks with Tristan Harris and Aza Raskin about what she found. https://bit.ly/4cqmvGS
-
In response to the growing global refugee crisis, governments are turning to novel #AI and #surveillance technologies. Tristan Harris and Aza Raskin sit down with immigration lawyer Petra Molnar on #YourUndividedAttention to discuss how borderlands have become a proving ground for high-risk technology. https://bit.ly/4cqmvGS
-
This week, the United States Surgeon General published a piece in The New York Times calling for a warning label on social media products. This comes on the heels of his 2023 advisory on social media and kids' mental health, which classified the issue as a public health crisis and urged a holistic response by government, industry, and society. There is no one right way to solve a public health crisis; it requires multiple types of interventions to treat and prevent it. While warning labels help the public understand that social media platforms are unsafe for youth, there is an equal need to prevent risks online through design-based changes. During #InternetSafetyMonth, this call from the Surgeon General serves as a timely reminder of the urgent need for a comprehensive approach to stronger protections from social media harms. https://lnkd.in/g7wqnDFn
https://www.nytimes.com/2024/06/17/opinion/social-media-health-warning.html
-
Last November, Meta whistleblower Arturo Béjar gave powerful testimony to Congress, offering an insider's view on how tech companies prioritize profits over online safety for kids and teens. As a father himself, Béjar was shaken by internal research showing young people's exposure to… 🌐 bullying 🌐 negative comparisons 🌐 …and sexual harassment on Meta’s platforms, all of which went unaddressed despite repeated escalation. Béjar stated plainly: tech companies know that design fixes are possible to make their products safer. But without realigning incentives away from engagement and profit, they'll choose the bottom line over wellbeing. Béjar’s insider perspective reignited urgent conversations about why government intervention is crucial to compel meaningful change. Listen to him speak about his decision to come forward and join the chorus of technologists advocating for platform accountability. https://lnkd.in/eQ4Kk3dV #InternetSafetyMonth
Meta's second whistleblower tells us why he came forward
-
During #InternetSafetyMonth, we're spotlighting two impactful reports on safety by design in action. Children and Screens: Institute of Digital Media and Child Development, along with LSE and 5Rights, released reports highlighting how the UK's Age Appropriate Design Code and other regulations have driven major platforms to make hundreds of design changes to better protect kids online. Key takeaways: 🔹 Product-level fixes work - if they can be implemented in one region, they're scalable worldwide 🔹 This serves as a model for proactive safety as the US considers kids' online safety legislation 🔹 The burden doesn't have to fall on parents & kids alone - platforms can create safer digital spaces by design These reports show a promising path for advocates and lawmakers to push for digital products that are safe from the start. Read the reports and share what you found most eye-opening or encouraging about the progress so far. ⬇️ 🔗 https://lnkd.in/gRT8Ubav 🔗 https://lnkd.in/e9J8v3gP
UK's Age-Appropriate Design Code Ushers in Nearly 100 Safe Digital Space Changes for Youth - Children and Screens
-
Whistleblower William Saunders quit over systemic issues at #OpenAI. Now he’s put his name to an open letter that proposes 4 principles to protect the right of industry insiders to warn the public about #AI risks. On Your Undivided Attention this week, Tristan Harris and Aza Raskin sit down with Saunders to discuss. #RightToWarn https://bit.ly/3Rhqn59