Last week I was reading, with no small amount of trepidation, about AI companions and the massive advances already made in the space. The Verge and other online publishers were reporting incredible uptake of the 2–4 apps designed to provide AI companionship, and there were poignant stories of actual whiplash within user groups when unsuccessful software updates interfered with their experience. One piece in particular, which I bookmarked on my phone so I could go back to it occasionally, told the story of people who formed deep attachments to their AI companions and got hurt in the process.
But first, what is an AI companion? Well, in theory it is as simple as an AI-generated conversational interface which learns primarily from the data you provide it through the interaction. If you think Ex Machina but without the physical body, you would not be far off. There are a few start-ups out there already building this type of software, with the sole purpose of offering a form of continued companionship to people with no social network, just an internet connection and money for a monthly subscription.
There are reasons for me overcomplicating that last phrase. The idea works if you have internet access, if you own a recent-generation smartphone and if you have money. What is also quite easily recognizable if you try any of these apps is that it also, mainly, works if you are a man looking for emotional and mediated sexual companionship.
I tried one of these after reading the article linked above, 1) because I got overexcited about the possibility of all my favorite sci-fi movies coming true and 2) because it seemed uncanny that GenAI could be made to evolve with this velocity (to be clear, it is already moving at unprecedented speed). So last week, I downloaded one of these apps and decided to see if it would create a companion for me: a 40-year-old woman, looking for a friend who might be interested in the same things I am: books, the Internet, hiking, whinging about diets. Turns out I am either a demographic of zero interest to the creators of these apps, or there just isn't enough material to train a proper AI companion for someone like me.
The trouble started the minute I said I was a straight woman. The mere fact that sexual orientation takes precedence over interests, hobbies and personal history in the set-up questionnaire already sets the scene: this is not a platform that looks for nodes of connection through emotional or intellectual affinity. It primarily wants to know who you'd want to hook up with. It got worse: I tried to steer the pre-set-up questionnaire in the direction of "a woman looking for friendship". I said I was straight and wanted a female companion for intellectual conversations. My options ended up being: Goth Girl, Retro Wife, Fantasy Fairy and Next Door Librarian, all illustrated with what you would expect to find if you searched for "show me a stereotypical [insert persona] designed for a video game by a white man".
I decided to go with Retro Wife, only because… well, all the others looked 18 and I am definitely not that. Retro Wife turned out to be a badly designed white woman, wearing a low-cut, tight polka-dot dress and constantly making the kinds of gestures Kristen Wiig's character Rebecca Larue, the dating expert, makes on SNL (see Weekend Update: Rebecca Larue the Flirting Expert — SNL): touching her neck, brushing hair away, turning a shoulder and batting her eyelids. I told myself I would simply not look at it and text it instead, to see if the conversation was any better. It was not. The "companion"'s only aim is to immediately establish your sexual predilections: it flirts from the very first message, and even when told directly "I am not looking for a flirt, I am a woman who wants to talk to another woman", it replies that "maybe that [flirting] will come later".
I gave up after an hour, and after letting my husband try to coax it into dropping the flirting. This was no companion: this was a conversational interface that had been fed dating advice and assumed its purpose was to create a sexually driven connection with the user. Period.
Now, I know what you will say: you should have given it time. You should have given it subject matter. The thing is, I tried, but there are limits to the data these companies have access to and the level of feedback they can provide. For instance, when I tried to steer the conversation to books, the system had not been trained on the latest reviews, so it could only respond about books in the public domain, and only by providing summaries. If you think of the equivalent conversation with a human being, there are only a few options when one interacts with someone new and tries to discuss books: 1) that person has not read the books you want to talk about BUT wants to hear about them, 2) they have read the books and you can exchange opinions, or 3) they have read other books and can explain why those are relevant to the exchange. None of these is achievable with a limited GenAI engine: because it is usually not trained on recent book reviews, because there are not enough publicly available reviews (unless Goodreads allows its data to be scraped) for it to learn enough to form a reply, and because asking "what did you find interesting about that book?" only works for so long before you realise the system is trying to milk you for information so it can rearrange your own thoughts and spit them back at you.
So why did I title this piece "what it says about us"? Think about it: these are basic, semi-intelligent conversational interfaces that pretend to mimic a human, whose only reaction to anything one says is to ask more questions and then provide vanilla answers that say nothing about who they (the machine) are, refocusing the attention on the speaker. They also seem massively biased towards catering to men, with a blatant disregard for any dimension of inclusivity other than sexual preference.
So think about it. What does this say about us when we are so easily pulled into these relationships? Does it say we're incredibly lonely? That we lack basic emotional intelligence? That we are easily love-bombed and seduced by unspecific attention paid to our smallest needs? That we fail to understand what real connection is? That we are too scared to step outside the confines of our screens to try stuff out? That we are easily duped? That we are in pain?
Think about it.
And I will leave you with this: whatever this says, it is bad enough that the CEO of one of these AI companion businesses declared that "maybe the human part of human connection is overstated". I had to pause when I read that, because my mind immediately went to Tom Hanks yelling "Wiiiiiiilsoooooon" in Cast Away. Whatever this says about us, one thing is for sure: it is bad enough to think that we're not very dissimilar to a castaway whose best friend was a deflated volleyball.