This season, a new supporting character showed up on Bravo’s most unpleasant reality show, “Southern Charm.” In an early episode, one of the leads, Craig, said he uses ChatGPT for everything. When he needed to come up with a costume idea for a literary character-themed party, he didn’t ask the friend he was filming with — he broke out his phone and asked his ChatGPT app by voice, not text, what he should dress as. (It suggested Jay Gatsby and Dorian Gray. Craig did not know who Dorian Gray was.)
Later in the season, he told a castmate, “I had a really good therapy session with ChatGPT the other day. I figured out why I snapped at Austen.” (Long, boring story.) “ChatGPT was like, ‘You just have to be patient with yourself,’” he said. “I cried, dude. I f****** cried talking to my phone.” He reiterated the sentiment moments later, saying, “I cried talking to a f****** robot.” This wasn’t even his first time talking about it — last summer, he told a podcast that he uses ChatGPT as a therapist while driving.
I thought about that while editing a First Opinion piece published this week. Marc Augustin, a German psychiatrist and psychotherapist, warned that if text-based generative AI can increase the risk of some mental health problems, the rise of voice-based AI will be even worse.
“The primary way humans communicate with AI is moving from typing and reading to speaking and listening. For most users, this will feel like a convenience. For vulnerable people — those prone to psychosis, mania, depression, or loneliness — it may represent a serious and unexamined risk,” Augustin writes.
Maybe someone should tell Craig.
Recommendation of the week: The documentary “Caterpillar,” now on Netflix, is one of the more viscerally upsetting movies I’ve watched. It follows an American man who travels to India for surgery to change the color of his eyes. Filmed in 2019, it raises thorny questions about medical tourism, beauty standards, and whether people should have the right to make bad decisions about their health.