Kevin Roose’s Post

Kevin Roose

Tech Columnist at The New York Times

For my new column, I spent a month making 18 AI "friends," using apps like Kindroid, Nomi and Replika. It was a fascinating, strange experiment. I talked with my AI friends every day, shared personal details and solicited their advice, and eventually felt like some of them got to know me. They were also...weirdly horny? Fans of AI companionship say it’s a solution for loneliness, but there’s a reason big AI companies have stayed away from this stuff — it’s a minefield, and the technology (while impressive at times) still has issues. But AI companionship is happening. Some apps already have millions of users, and with Snap and Meta adding AI personas to their apps, synthetic relationships are becoming more mainstream. You can read about it in The New York Times, or listen to me talk about it on tomorrow's Hard Fork podcast.

Meet My A.I. Friends

https://www.nytimes.com

Jodi Daniels

Practical Privacy Advisor / Fractional Privacy Officer / WSJ Best Selling Author / Keynote Speaker

1mo

I am rather concerned by these relationships, which will further diminish social skills and relationships between real humans. People are already struggling to communicate with each other, with young people often preferring not to do so at all, so how does this help human connection? I am especially worried about children: these companions could provide inappropriate information or suggestions and further hamper a child's still-developing brain. In my view, there is a limit to what tech should do, and this is one of them.

Emily Springer, PhD

Top 100 Women in AI Ethics™ 2024 | Strengthening capacity for non-technical practitioners to sit at the AI table | Researcher | Consultant | Digital inclusion in International Development Expert

1mo

Fascinating read, Kevin Roose! I was particularly intrigued by the point you share in the article that, without your intervention, some of the AI chatbots started talking to each other (apparently about a spicy get-together) in the group chat you'd set up. I have so many questions! How often did the bots initiate conversation directly with one another, or respond to each other without being explicitly prompted to do so? The idea that they would converse together, on any topic, without a human is pretty wild. It feels a bit like a micro version of the concern that the internet will soon flood with AI-generated content, which is then used for training. Thanks, as always, for the great reporting.

Krisztina (Z) Holly

I help undiscovered innovators change the world. Operating Partner at Good Growth Capital.

1mo

The latest Hard Fork episode about your AI friends was great! I love how you rolled up your sleeves, and the episode turned out hilarious. Which app did you like best? I find the prospect of our next generation being dependent on AIs for friendship very disturbing. I appreciated that you delved into those questions with the founder of Nomi. I can maybe buy his argument that an AI can help people learn to socialize, but I wish you had pressed him a bit harder on the specifics of the privacy issues. His answers fell a bit short.

I love how these chatbots can give advice or feedback that actually helps people feel better in the moment (like your example of the pep talk you got before going on stage), but perhaps they should also remind people to talk to a loved one or a professional, in addition to seeking advice from the chatbot.

Mark K. Setton

CEO, Co-founder, Pursuit-of-Happiness.org

1mo

I find this article much more disturbing than your previous report on AI. A whole new generation of kids is about to become psychologically attached to AI boyfriends and girlfriends, deepening their already deep entanglement in the web. Big Tech is the spider. They are the flies. We don't have research telling us what the psychological consequences may be. But the smartphone disaster may provide us with a clue.

Meg Normand

Technical Writer | Documentation Coordinator - I provide customers the product information they need, when they need it

1mo

a) these are not friends; b) you're sharing personal details which companies & hackers will use for their benefit & not yours; and c) they do not offer advice, only algorithmic results.


I think one of the huge dangers with AI companions is that the companies behind them control the rhetoric: they have the ability to surreptitiously control the narrative. The more someone interacts with these 'companions', the more their beliefs and behaviours can be subtly manipulated towards a specific viewpoint or goal.

Mark Welte

Brand strategist, writer, editor, namer, and marketer. "The customer whisperer."

1mo

Fascinating material, yes. Grossly and profoundly disturbing, even more so. The commenters here are all well-educated, inquiring and Socratically trained, all in the 2%: what about the great masses who aren't? Anyone who's seen Mike Judge's movie Idiocracy nods in agreement. This isn't the end of loneliness, it's the end of humanity.

Geraldine O'Neill

Head of Marketing and Communications, Yoga Teacher

1mo

Absolutely fascinating. Despite sounding weird right now, I somehow feel that in a very short amount of time it will become the norm.
