By Sherry Turkle and Pat Pataranutaporn
Nov. 8, 2024
---
Our new chatbots pose as confidants, lovers, psychotherapists and mentors. Their creators encourage us to believe these products have empathy, even love for us. More than 20 million people currently use Character.AI, a market leader in AI companionship. But a chatbot’s emotion is a performance of emotion. A chatbot is not, in fact, able to care for us. Presuming otherwise can be dangerous.
The Sewell Setzer tragedy has already prompted talk of AI “guardrails,” age requirements and parental sign-offs for chatbots. Some are calling for better protocols for handling words and phrases that point to self-harm, and for ways to educate parents about the intimate, often sexual, nature of avatar gameplay.
These are worthy conversations, but they distract us from a more important truth: that artificial intimacy is no substitute for human connection. Chatbots don’t engage in relationships. They merely perform humanness.