This article is part of the weekly technology newsletter, sent every Friday. If you want to sign up to receive it, you can do so at this link.
Olivia Tai has a degree in psychology from Harvard University and holds sessions in New York and San Francisco to better understand how people relate to artificial intelligence (AI). A few days ago she brought one to the Mozilla Festival held in Barcelona, entitled "Your brother-in-law ChatGPT: love in the time of AI". Tai's goal is to surface something invisible: each individual's personal, secret use of their favorite chatbots.
Most of this usage is trivial or useful, but the variety is enormous. In her sessions, Tai hands out sheets with 14 different categories, from "informal conversations" to "exploring taboos" or "existential questions".
At the heart of Tai's work are two reflections on our relationship with technology: the privacy of our personal relationship with ChatGPT, and the importance of intimacy today.
1. It’s harder to know what’s happening on ChatGPT than on Instagram
"Ten years ago, if we were gathered here to talk about a new Instagram feature, we would all know what it was. Today, everyone uses ChatGPT in a different way," Tai said. That's why she does these sessions. It is important to share personal details, because we don't know what happens in other people's ChatGPTs.
2. Let’s enter the economy of intimacy
AI will take us into another economy, and intimacy is the most appropriate name for it, says Tai: "When attention began to be seen as a resource that could be captured, the attention economy was built. Now that it takes away the time people spend with their friends, they feel lonely. The social fabric is falling apart. Everyone feels socially connected on Instagram, but they don't feel socially fulfilled."
The intimacy economy is coming: "If we lived in a healthy society, we couldn't create the intimacy economy. Nobody would buy those services," Tai says. If we didn't have Instagram, AI friends wouldn't work. If chatting weren't already part of our habits, AI chatbots wouldn't work. Each technology builds a habit of how we interact with it so that the next thing can come.
But Tai's session also served to uncover other secrets.
3. What is your unspeakable use?
To probe the limits of individual use, Tai asks attendees to share their most unspeakable use anonymously: there are those who upload all their boss's emails to their AI to learn how to handle them better. There are also those who upload their photos, or all their blood tests.
Tai hasn't seen many strange cases involving love, which doesn't mean they aren't out there: "There must be many more cases in the United States, because we experience the loneliness epidemic more. With AI, there are two types of intense and harmful phenomena affecting a small population: AI-induced psychosis, and falling in love with an AI. These are strong things that make people wonder what's going on."
4. “I really don’t think I should do this.”
Beyond the extreme cases, there are others that seem more normal. But they remain dubious, leave a bad aftertaste, and lead users to wonder whether they should be doing this with an AI.
Tai connects them partly to what we do in the solitude of the night. "We feel an impulse, curiosity; we want to know what it's going to say," Tai says.
What's new here? There are things we do with AI that have never been possible before. It's one thing to want social validation from a friend, or from ChatGPT, when you're shopping for shoes. But then there's the possibility of having yourself validated entirely as a person: "If I upload my entire diary to an AI, that's something a friend can't do," Tai says.
5. “Tell me who I am deep down”
Here we enter undefined terrain: is it a second mind? Is it some kind of new memory bank holding all my photos and memories? At the same time, it's not like opening a photo album or rereading your diary. It's something else: "It's like I'm organizing my thoughts and experiences, making sense of them, asking myself who I am. And many people use artificial intelligence for this, to ask: who am I? What do I need to understand about myself? Who am I professionally? Where should I go?" Tai says.
Maybe it's just a digital internal monologue, nothing more. Or maybe not: "Sometimes I use it for that," Tai says. "It's difficult to speak for the experience of others. If you're a very cerebral person, someone who overthinks, you can ask the AI a question and it answers you, but it just makes your head spin more. You don't really reach a solution, nor do you get clarity or simplicity about your question."
In this personal exploration you can ask questions about very dark things, “about something very taboo that goes through your head or a radical political opinion that you wouldn’t say out loud,” Tai says.
6. It’s a short step from there to creating a relationship
There are many types of relationships. The more extreme ones usually make headlines: "Users who deceive themselves with chatbots end up living in a 'cult of two'" is one from this week.
But you don't need to go that far. There are people who already feel jealous: "There are people who have told me, 'Yes, I have a partner, but my partner spends the day talking to an artificial intelligence and that makes me a little jealous,'" says Tai. Or they outright consider their relationship with their partner and their chatbot to be polyamorous.
These are just steps in an obvious direction: "I think there will be a presence of AI at the table at celebrations like Thanksgiving or Christmas," says Tai. For now, artificial intelligence has no social context: "They are not very good because, for example, Replika (a chatbot focused on personal conversations) flirts with anyone" in a social situation, Tai says.
But AI will soon know, in a social context, who is who and who came to the dinner or the party. Today you can set your phone on the Christmas table as another guest, but it doesn't understand the context or who it's talking to. Soon it will. Then, at home, you will discuss the move not with your partner but with your AI. Or with both.
7. Teens are the frog of ChatGPT
Tai believes older people don't have much to fear. There will be extreme cases, but they will be few. Adults have had years of relationships, disappointments, and distance from chatbots. The big question for her is what will happen in the medium term. Only two kinds of people can have answers, Tai says: therapists and teachers.
Therapists may surface particular cases, but teachers will be the first to see potentially massive ones: "In student surveys, 18-year-olds say they don't see themselves in a relationship with an AI. But younger kids, those who have never kissed anyone and have only questions and fears, may end up with a different relationship with their AIs."
It's a concept that comes from ecology, Tai explains: "When a chemical creates a toxin in an ecosystem, there's always a very sensitive species, like a frog. That frog will notice all the changes first, so we study the frog," she says.
Those frogs, she thinks, are teenagers. And there is one sector of society that will not leave them alone: corporations. There are already chatbots for teens focused on high school gossip, like Tolan. You tell your chatbot what happened in class, so that your chatbot can go and discuss it with your friend, or better yet, with your friend's chatbot. And then there are four of you talking instead of two.
