Why Trans Lives Keep Getting Deleted by “Smart” Systems

Behind the promise of efficiency and neutrality, algorithmic systems quietly exclude those who don’t fit their code. For trans people, that means being misgendered, denied access, or erased entirely. Cambridge-based researcher Christoffer Koch Andersen reveals how AI doesn’t just fail trans lives. It is built on logics that make those lives impossible.

Image generated by artificial intelligence.


As states increasingly lean on “neutral” algorithms to determine access to services and recognition, their repeated failure to register trans lives is more than just a glitch – it is transformed into proof that those lives don’t exist. Omission becomes evidence. It’s a form of algorithmic violence that is quiet, bureaucratic, and rarely told.

This issue sits at the core of Christoffer Koch Andersen’s academic career. Trans and tattooed, Andersen is currently completing a PhD at the University of Cambridge, where he is determined to challenge systems once thought beyond critique.

“I’m bored of having to justify why these topics are important,” he says. “They are important, and we just need to start doing something about them.”

What is an Algorithm?

Asked ‘What is an algorithm?’, Andersen offers a simple answer as to their purpose: “They classify.”

Though algorithms are often seen as modern inventions, Andersen’s work uncovers a longer, pre-digital history beneath the buzzword.

At their core, algorithms are sets of rules or repeated steps used to solve a problem or make a decision. By processing inputs through fixed operations, they produce outputs – often in the form of categories. This is an act that does not simply claim to describe the world, but instead to define it.
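To make that definition concrete, here is a deliberately toy sketch (hypothetical, not any real system) of what “algorithms classify” means: fixed rules map an input to a category, and anything outside the rules cannot be represented at all.

```python
# Toy illustration: an algorithm as fixed rules mapping inputs to
# categories. The rule set is the system's entire "worldview" --
# inputs that don't fit the template are rejected, not described.
def classify(record: dict) -> str:
    if record.get("gender") == "F":
        return "category_A"
    if record.get("gender") == "M":
        return "category_B"
    raise ValueError("input does not fit the template")

print(classify({"gender": "F"}))  # category_A
try:
    classify({"gender": "X"})
except ValueError as e:
    print(e)  # input does not fit the template
```

The point of the sketch is the last branch: the system doesn’t merely describe the world poorly, it defines in advance which kinds of people can exist within it.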

While writing his first report, he focused on what happened during colonisation. He explains how when defining bodies on an economic level, Europeans “classified people into binary categories of race: Black females, white males,” creating “distinctions between attributes seen as investable and valuable.”

“In the West Indies, you had slave reports that made classifications based on age, description – but also race and sex – what we would understand as gender now.” These reports led him to understand that “the way algorithms comprehend humans is based on classification principles drawn from historic legacies.” All of which, without interrogation, can be “encoded into modern systems.”

As a recent example, a 2022 study found that OpenAI’s CLIP model categorizes race using the American “one drop rule” – a classification system rooted in slavery. Within this model “individuals with multiracial ancestry are more likely to be… categorized as belonging to the minority… parent group.” The researchers note that within humans, adherence to this racial model, where a child of Black and White parents is perceived as more Black than White, typically reflects a belief in racial purity and hierarchy. 

The Lie Behind AI

So looking at artificial intelligence – a technology governments are jumping to integrate into our systems and services – Andersen seeks to peel back the streamlined messaging, repositioning AI not as a novel rupture from history but as a continuation of longstanding practices. 

“There is no such thing as AI. AI is a branding exercise, an umbrella term for a lot of algorithmic technologies.”

It’s the kind of statement that demands a double take. For him, this isn’t about denying the existence of machine learning or large language models, but about cutting through the mythos that surrounds them – and reminding us that behind every system lies human intention, design, and oversight.

He explains further: “I think the biggest lie we’re told is that AI is an entirely novel technology, that it’s going to revolutionise the world. Looking at what AI is, it’s basically statistics on steroids. It’s algorithmically predicated. These technologies are branded as AI because they can learn by themselves. But this doesn’t mean that they’re isolated from human involvement or the algorithmic code that makes it happen.”


Disregarded by the Digital

To demonstrate the relationship between algorithms, AI, and the human factor in-between, Andersen turns to a personal experience.

“Denmark is one of the most digitised countries in Europe,” he explains, with this encompassing “access to all welfare services, access to your bank account or anything else you would want to do in a state.” At the centre of this system is the social security number, binary gendered by design: “If the last digit is even, you’re registered as a cisgender female; if it’s odd, a cisgender male.”
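The gender rule Andersen describes can be written in a single line of code, which is part of his point. A minimal sketch of the parity rule as he states it (illustrative only; real Danish CPR numbers also encode birth date and carry other checks):

```python
# Sketch of the rule Andersen describes for Danish social security
# (CPR) numbers: the parity of the final digit encodes a binary legal
# gender. Illustrative only -- not a full CPR validator.
def cpr_registered_gender(cpr: str) -> str:
    digits = cpr.replace("-", "")
    last = int(digits[-1])
    # Even -> registered female, odd -> registered male.
    # There is no third value: the encoding itself is binary.
    return "female" if last % 2 == 0 else "male"

print(cpr_registered_gender("010190-1234"))  # female
print(cpr_registered_gender("010190-1235"))  # male
```

Because the function can only ever return one of two values, every downstream system that reads this number inherits the same binary, which is exactly the inflexibility Andersen’s experience exposes.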

After changing his legal gender, Andersen recalls how the system began sending him reminders for prostate cancer screenings, despite not having a prostate. More worryingly, the same system failed to invite him for a smear test, potentially endangering his health.

For trans people, the process of updating documents and changing legal and physical gender is already fraught. But what Andersen draws attention to is how this bureaucratic inflexibility becomes more consequential once embedded in powerful technologies.

“Over the last 10 years, the Danish state has invested heavily in algorithmic technologies to simplify access to essential services,” he explains. To set up MitID – described on the government’s website as “one key for all of digital Denmark” – users must input their gender-coded social security number, scan their passport, and then their face.

For Andersen, this created an absurd situation. His passport photo was taken pre-testosterone, his current face since altered by hormone therapy. “This technology did not work on my face,” he recalls. “I scanned myself sixteen times and it wouldn’t work.” Soon after, he received an email warning that someone was trying to impersonate him. “But I was just sitting there in my apartment thinking to myself, it’s literally me!”

The facial recognition software could easily verify his mother using a passport photo over a decade old. The issue wasn’t just aging or image quality – it was how gender is encoded across every layer of the system. From the social security number to the facial recognition algorithm, the infrastructure assumes a stable, binary identity. For Andersen, the result was being locked out of his bank account for twelve weeks.

This, amongst many other instances, is an example of what Andersen identifies as ‘trans impossibility.’ 

Generally speaking, trans impossibility describes the experience of living under algorithms “coded only to work for people who fit the gender binary.” Under these systems, “the possibility of living, carrying yourself through daily life and functioning in society is predetermined by whether or not you fit the template of the gender binary. If you don’t, it becomes ever more impossible to lead a life like everyone else, because you’re constantly monitored or stopped or constructed as a problem by these technologies.”


You Don’t See Us Until You’re Aiming

Being ignored by powerful institutions eager to digitise society is what Andersen terms structural invisibility. It’s not overt hostility that makes life harder, but exclusion. When you’re not counted, you don’t count. Your existence becomes unrecognisable, unviable, unserviceable.

But invisibility is no longer the only danger. Andersen notes a shift. Some states are moving from disregard to direct targeting. In the U.S., for example, the Department of Homeland Security has scrapped protections that once prohibited surveillance based on gender or sexuality. Now, identity itself can be grounds for monitoring.

This isn’t about what technology is, Andersen argues, but what it’s used for. “The Hungarian government wants to ban pride parades,” he notes, “but what happens when you throw technology into the mix? Now they want to use facial recognition to identify and fine attendees.”

Rather than being inherently oppressive, these technologies become tools to amplify existing agendas. They erase queer people through neglect, only to then place a target on their backs.

A defining feature of both algorithmic systems and gender constructs, as methods of classification, Andersen argues, is how they obscure their own origins.  

“Artificial intelligence and the gender binary are impeccably good at imitating something natural and organic. And if something is natural, it appears as something we can’t discuss or dissemble.” He sees this as a key reason why these systems “reinforce each other so well,” reiterating that we can only unpack them “if we begin tracing the legacies of how they came to be.”

When a system appears natural, its rules are seen as timeless, its logic unchallengeable. To illustrate how these systems of classification can be unpicked, Andersen raises a deceptively simple question: “Why do we even have a gender marker in our passports?”

Fixated on the issue, he traced its bureaucratic origins and found a revealing answer. The United States introduced a gender marker in its passports in 1977. Before that, no such marker was deemed necessary.

Digging deeper, Andersen discovered that the U.S. State Department had introduced the marker out of anxiety. “They were really worried during the 70s, during international collaborations through the New York fashion scene, that people would pose as the opposite gender at the border through androgynous clothes and hairstyles.”

Gender divergence was reclassified as more than social deviance. It was a security threat. They feared that individuals might “deceive and carry foreign objects into the US,” and sought to make such gender transgressions impossible.

Homing in on when gender divergence became reclassified as a security threat, Andersen points to recommendations made by the International Civil Aviation Organization in 1968. Even in 2012, the ICAO argued that removing gender markers would “complicate the operations of border authorities [as they] use the gender field as an input into risk assessment.” The logic was thus: if gender couldn’t be instantly legible, it became a risk.

What emerges is a pattern. Gender classification and surveillance technologies co-evolving, justified through appeals to safety, order, and nature. But as Andersen insists, these systems only appear natural because we’ve forgotten – or have been encouraged to forget – how and why they were built.

Behind the Screens: a Battle for AI’s Soul

So who gets to shape the technologies shaping us? For Andersen, the ethical failures of the biggest players are stark. “When Google developed its own AI, they removed their promise to adhere to human rights and protection of minorities when developing AI technologies.” He sees a clear link between this corporate irresponsibility and broader political shifts. “With politicians in power who don’t really care, we come close to a really hostile fascist tech environment.”

But within this bleak landscape, Andersen also points to countercurrents, efforts to push back. He references the EU AI Act, which places certain technologies under a tier of “unacceptable risk,” including those that “use biometric attributes to classify or categorise people.” It’s a promising move. But, as he notes, “this is already happening… to trans people, to people of colour, to queer people.” He holds faith in regulation: “It’s not because they’re not trying. I just think they need people like us to expand on the things they don’t quite get yet.”

Promising work is further being developed in the wider field. He highlights Oliver L. Haimson’s publication Trans Technologies, which imagines “what trans-affirming, liberating technologies would look like.” Encouragingly, Haimson secured funding to develop such tools – only for the grant to be cancelled under the Trump administration. “A lot of things are happening, but also being hindered by the politics.”

Nonetheless, Andersen remains committed to pushing forward, whether through publishing, teaching, or advising EU policymakers directly. “Because these systems are constructions, we also have a potential to construct them into something different.” This way, “we don’t get caught up in pessimism and hopelessness.”

Algorithms may govern more and more of our lives, but oppression is not inevitable. A queer reconfiguration is possible. The power to choose how these technologies comprehend humans lies in our hands. 

And in this way, the algorithm doesn’t get the final word.

 

Subscribe to our newsletter here.

 – – –

If you want to hear the most essential news commented on in-depth, listen to our weekly podcast, Queer News & Journalism or go to our YouTube Channel @GAY45mag.

– – –

Submit a letter to the editor at [email protected].

– – –

When we learn of a mistake, we acknowledge it with a correction. If you spot an error, please let us know.


We appreciate it. Thanks for reading.


Did we mention we accept donations? Indeed, love.

If this story matters to you, help us tell the next one — donate what you can today.

Support GAY45