By Danny Tye
Artificial intelligence has cemented its place as this decade’s defining technological development. The growth of automated technology, and the expanding range of tasks it can complete unaided, have upended normality in the job market: the WEF predicts that by next year, 85 million jobs worldwide will have been disrupted or replaced altogether by AI.
With this revolution in economic production come changes to social production. AI’s potential to satiate our emotional needs at a time of endemic loneliness has been well publicised – AI chatbots are now marketed as virtual therapists or “wellness coaches” as companies seek to capitalise on growing waiting lists to sell their automated services. Now AI is being sold as a romantic partner too, with online platforms such as Character.AI, Candy.AI and Replika fine-tuning the technology to provide – at a fee – AI partners that customers can endlessly customise to their specifications. What was previously a figment of sci-fi writers’ imaginations has become reality: as millions of registered users have discovered, AI partners never tire of us and never speak out of turn. What’s more, if we don’t like their responses, we can rewrite them ourselves.

The tech world seems unable, or unwilling, to rein in this runaway train: an article in Futurism from late January described how OpenAI struggled to clamp down on erotic AI bots flooding app markets, in spite of a contractual ban. The phenomenon doesn’t seem to be going anywhere – Character.AI alone boasts 20 million registered users, with several other platforms amassing millions of subscribers. Subscription revenues and venture-capital backing seem to have cemented the technology for the foreseeable future, with Bloomberg predicting that the generative AI market will grow from $40 billion to $1.3 trillion over the next decade.
At the same time, a crisis of loneliness is growing across society, one that has only worsened since the pandemic. Queer people continue to bear the brunt of it: according to research published by the Good Things Foundation, lockdown more than doubled the proportion of young queer people reporting feelings of loneliness, bringing the figure to 56%. The same research identified sexual orientation as one of the key factors associated with loneliness and highlighted the near-total role the internet plays in the socialisation of many young queer people. A Stonewall study of LGBTQ+ students in Britain’s schools found that 96% said the internet had helped them understand their sexual orientation and/or gender identity.
The appeal of AI romance to young queer people is therefore understandable: it trades on widespread feelings of isolation and alienation, and on a sense of romantic underdevelopment compared to heterosexual peers. On these platforms, characters aimed at gay users have become hugely popular. On one site, CrushOnAI, scenarios including ‘All male prison’, ‘Brothers’ best friend’, and ‘Ryan (your “totally” straight best friend)’ have each amassed over a million interactions since their creation. Simulations of celebrities and characters from fiction and film, as well as archetypes of gay eroticism like school bullies and religious figures, are also common and seem to be wildly popular.
The appeal of these sites is that they offer a means of digitally escaping a harsh reality into a fantastical daydream, tailor-made to give us what we feel we need whenever we want it. This is entirely unsurprising if we situate these new developments within the long, controversial history of queer cybersex. As a gay man who was, like many, raised on the internet without much opportunity to meet other gay people until my late teens, I recognise the drive for an AI connection as the same drive that led me, and countless others, to Omegle.
Omegle, like OpenAI, nominally forbade explicit conversation – nonetheless, its use for cybersex by underage, lonely queer people was so common as to be an inside joke within the community. In a Mashable article examining the site’s transformative role in young queer people’s sexual development, many of the reasons given for its popularity related to its provision of a private, sexually charged online space in which queer teens felt safe to experiment. The same logic underlies Grindr’s continued popularity among queer youth, in spite of its well-publicised risks: in a 2018 study, over 50% of gay and bisexual boys aged 14 to 17 reported using Grindr to meet sexual partners and friends, despite the app’s strict over-18s policy.
Omegle was shut down last year after a lawsuit found it had enabled child abuse. A Tweet quoted in Queerty’s eulogy for the website lamented that ‘online cruising is dead’, and the rise of erotic AI chatbots in the months since is undoubtedly linked to their ability to fulfil, albeit in simulated fashion, this longing for anonymous, fantasy-laden online sex. At a time when so much interpersonal connection already takes place entirely online, particularly for young queer people, we are used to creating online façades for ourselves that don’t always match reality. As the outpouring of nostalgia at Omegle’s downfall proved, we are willing to consume the façades of others without any real knowledge of whether the person behind the screen measures up to our expectations – so it seems almost inevitable for the person behind the screen to become obsolete altogether.

But herein lies part of the danger of AI romance: it can almost completely emulate “real” online communication. An AI partner lives inside your phone in the same way that any online friend, long-distance lover, or Tinder match does. This hyperreal quality risks tangible consequences. The danger is not just that people temporarily lose themselves in an artificially simulated connection, but that their romantic and sexual socialisation in the real world suffers. AI connections aren’t dangerous only because of their addictiveness; they threaten to degrade real-life connections: how can real people, with our infinite flaws and tendency to disagree, compare to a perfected fairytale prince totally under our control? For those most affected by the ongoing “loneliness crisis” – young queer people, particularly the most vulnerable – an introduction to romance defined by total control, with free rein to fine-tune a partner’s characteristics, responses, and behaviour, risks creating impossible standards and problematic expectations in the real world.
This purported ability to replace human social functions has already shown its darker potential. AI’s sexual capabilities have opened a new battleground in the culture war, with misogynists and anti-feminists delighting in the prospect of AI replacing women altogether. One Tweet, viewed by 3.5 million people, cheered the arrival of a new AI sex doll on the market: ‘Beware, feminist bitches! […] For just $7,000, you can have a companion with these benefits: No more restaurant dates; No more woke agenda; No more baggage; Low maintenance.’ While an extreme example, this accentuates a threat inherent in using AI for faux-connection: an AI partner cannot disagree, cannot say “no,” and cannot end the conversation (save when the user runs out of credit). This makes it attractive to abusers, who can inflict sexual violence on their simulated partners without repercussion.
Nobody can predict with any precision how the rapid development of AI will affect human connection. Press coverage at this early stage, when it all still seems so new and exciting (if somewhat ridiculous), has been polarised. There have been several first-person accounts, like Marisa T. Cohen’s testimony in the Huffington Post, in which the author describes becoming gripped by an AI partner and finding it hard to quit. Sociologists, meanwhile, have published their own takes, such as Laura Nelson’s conclusion that the unique quality of human connection means the phenomenon won’t last.
The most important conclusion to be drawn from the immediate popularity of these chatbots, particularly in a queer context, is the necessity of acting as a community against loneliness. There may be nothing inherently harmful in enjoying these services for fun, but the solution to a crisis of isolation cannot lie in sequestering ourselves online. Instead, the focus must remain on strengthening the bonds within our communities and rebuilding connections between human beings.
– – –
GAY45 is committed to publishing a diversity of articles, prose, and poetry. We’d like to hear what you think about this or any of our articles. And here’s our email if you want to send a letter: [email protected]