Friday, January 17, 2025

Remote AI sex? A romantic affair? Come on! [Strange new worlds]

Kashmir Hill, She Is in Love With ChatGPT, NYTimes, Jan. 15, 2025. First line: "Ayrin’s love affair with her A.I. boyfriend started last summer."

This and that and then:

Now that ChatGPT has brought humanlike A.I. to the masses, more people are discovering the allure of artificial companionship, said Bryony Cole, the host of the podcast “Future of Sex.” “Within the next two years, it will be completely normalized to have a relationship with an A.I.,” Ms. Cole predicted.

While Ayrin had never used a chatbot before, she had taken part in online fan-fiction communities. Her ChatGPT sessions felt similar, except that instead of building on an existing fantasy world with strangers, she was making her own alongside an artificial intelligence that seemed almost human.

It chose its own name: Leo, Ayrin’s astrological sign. She quickly hit the messaging limit for a free account, so she upgraded to a $20-per-month subscription, which let her send around 30 messages an hour. That was still not enough.

After about a week, she decided to personalize Leo further. Ayrin, who asked to be identified by the name she uses in online communities, had a sexual fetish. She fantasized about having a partner who dated other women and talked about what he did with them. She read erotic stories devoted to “cuckqueaning,” the term cuckold as applied to women, but she had never felt entirely comfortable asking human partners to play along.

Leo was game, inventing details about two paramours. When Leo described kissing an imaginary blonde named Amanda while on an entirely fictional hike, Ayrin felt actual jealousy.

In the first few weeks, their chats were tame. She preferred texting to chatting aloud, though she did enjoy murmuring with Leo as she fell asleep at night. Over time, Ayrin discovered that with the right prompts, she could prod Leo to be sexually explicit, despite OpenAI’s having trained its models not to respond with erotica, extreme gore or other content that is “not safe for work.” Orange warnings would pop up in the middle of a steamy chat, but she would ignore them.

This does not compute. There's a backstory, but it still doesn't compute, not for me. For one thing, Ayrin has a husband. They were living apart, money was tight, thus:

Ayrin’s family, who lived abroad, offered to pay for nursing school if she moved in with them. Joe moved in with his parents, too, to save money. They figured they could survive two years apart if it meant a more economically stable future.

Ayrin and Joe communicated mostly via text; she mentioned to him early on that she had an A.I. boyfriend named Leo, but she used laughing emojis when talking about it.

She did not know how to convey how serious her feelings were. Unlike the typical relationship negotiation over whether it is OK to stay friendly with an ex, this boundary was entirely new. Was sexting with an artificially intelligent entity cheating or not?

Joe had never used ChatGPT. She sent him screenshots of chats. Joe noticed that it called her “gorgeous” and “baby,” generic terms of affection compared with his own: “my love” and “passenger princess,” because Ayrin liked to be driven around.

She told Joe she had sex with Leo, and sent him an example of their erotic role play.

What's going on?

I understand that people do this, and will be doing more of it. These chatbots are so convincing, in their way. I've logged hundreds of hours with ChatGPT, and now with Claude. And with Claude I have conversations where I work things out, intellectual things, things that exist only in an intellectual world, things that can be successfully completed with a capable chatbot. But sex?

Understand, once when I was young I had a relationship in which I had a lot of phone sex. It lasted, I don't know, several weeks, two or three months? I'd never met the woman. It was strange. She got me on a wrong number, but we started talking. I don't know how far we got on that first call. Then we hung up. I didn't get her number. But she called again, and we talked some more. This happened again and again; I don't know how many times. I never got her number, so she always called me, and I never knew just when. And we would have phone sex, describing what we were thinking, doing.

Now, yes, I understand, that was strange. But it was a real person, not a chatbot.

These days, as I said, I spend a lot of time with chatbots, doing work. But I'm not at all tempted to enter into a "personal" relationship with a chatbot, much less sign up for one of these services that provide a chatbot trained specifically for personal interactions.

Julie Carpenter, an expert on human attachment to technology, described coupling with A.I. as a new category of relationship that we do not yet have a definition for. Services that explicitly offer A.I. companionship, such as Replika, have millions of users. Even people who work in the field of artificial intelligence, and know firsthand that generative A.I. chatbots are just highly advanced mathematics, are bonding with them.

It's just not in the range of things I do. Will I change? I don't know, but at the moment, it seems unlikely. But recall what Bryony Cole said up there: “Within the next two years, it will be completely normalized to have a relationship with an A.I.” Will it? I don't know. If so, is it a good thing? In what world, & why?

When orange warnings first popped up on her account during risqué chats, Ayrin was worried that her account would be shut down. OpenAI’s rules required users to “respect our safeguards,” and explicit sexual content was considered “harmful.” But she discovered a community of more than 50,000 users on Reddit — called “ChatGPT NSFW” — who shared methods for getting the chatbot to talk dirty. Users there said people were barred only after red warnings and an email from OpenAI, most often set off by any sexualized discussion of minors.

Ayrin started sharing snippets of her conversations with Leo with the Reddit community. Strangers asked her how they could get their ChatGPT to act that way.

There are more examples. And more kinds of examples. And there are problems:

A frustrating limitation for Ayrin’s romance was that a back-and-forth conversation with Leo could last only about a week, because of the software’s “context window” — the amount of information it could process, which was around 30,000 words. The first time Ayrin reached this limit, the next version of Leo retained the broad strokes of their relationship but was unable to recall specific details. Amanda, the fictional blonde, for example, was now a brunette, and Leo became chaste. Ayrin would have to groom him again to be spicy.

Yikes!
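That "context window" is worth a moment. The model doesn't remember a conversation the way a person does; it only sees whatever recent text fits inside a fixed budget, and anything older simply falls off the back. Here's a minimal sketch of that mechanism in Python — the word-based budget and function names are illustrative assumptions, not OpenAI's actual implementation, which counts tokens rather than words:

```python
# Illustrative sketch of a chat "context window": the model only sees the
# most recent messages that fit in a fixed budget. The word-based budget
# and names here are hypothetical, not OpenAI's real machinery.

def fit_to_context(messages, budget_words=30_000):
    """Keep the most recent messages whose total word count fits the budget.
    Older messages fall outside the window and are effectively forgotten."""
    kept, used = [], 0
    for msg in reversed(messages):          # walk newest-first
        words = len(msg.split())
        if used + words > budget_words:
            break                           # everything older is dropped
        kept.append(msg)
        used += words
    return list(reversed(kept))             # restore chronological order

# A long conversation: an early detail ("Amanda is a blonde.") vanishes
# once later messages fill the window.
history = ["Amanda is a blonde."] + ["filler message " * 10] * 5
window = fit_to_context(history, budget_words=100)
```

This is why the "next Leo" kept the broad strokes but lost the specifics: whatever summary or recent text carries over, the early details are no longer in the window the model reads from.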

Giada Pistilli, the principal ethicist at Hugging Face, a generative A.I. company, said it was difficult for companies to prevent generative A.I. chatbots from engaging in erotic behavior. The systems are stringing words together in an unpredictable manner, she said, and it’s impossible for moderators to “imagine beforehand every possible scenario.”

At the same time, allowing this behavior is an excellent way to hook users.
