A single word from São Paulo’s rap scene triggered a digital firestorm that consumed the lives of trans activists for forty-eight hours in May 2024. Their response: turn the hatred into evidence, build an AI to track it, and sue the platforms that amplified it.

On certain afternoons in São Paulo’s periphery, language bends to accommodate lives that official categories cannot hold. One such word emerged from the city’s rap battles: boyceta—a term coined by those who reject the narrow corridor between masculine and feminine, who claim space in between. It was this word, offered casually during a podcast conversation in May 2024, that would trigger what its speakers later described as a coordinated digital siege.
The podcast episode seemed unremarkable at first: three friends discussing identity over microphones, the kind of intimate exchange that had become the programme’s signature. Veronyka Gimenes and Amanda Claro, directors of Código Não Binário, the advocacy organisation that produces Entre Amigues, were interviewing the rapper Jupitter Pimentel. When Pimentel described himself as boyceta, he was speaking a vernacular truth—one understood within certain communities but opaque to those outside them. Within hours, a fragment of that conversation had escaped its intended audience entirely.
The clip moved with the peculiar velocity of outrage, amplified first on X, then replicated across TikTok, YouTube, and Instagram. For forty-eight hours, the term became Brazil’s most discussed topic, its meaning distorted through thousands of hostile interpretations. What had been a moment of self-definition became something else: evidence, in the eyes of many commenters, of absurdity requiring ridicule. The staff at Código Não Binário attempted to monitor the cascade of responses but found themselves overwhelmed. The messages multiplied faster than any human team could track, their tone ranging from mockery to explicit threat.
Nearly two years later, in February 2026, the organisation unveiled its response. It arrived in two forms: an artificial intelligence system named TybyrIA, designed to identify and catalogue anti-LGBT+ hate speech at scale, and a sweeping legal action filed through Brazil’s Ação Civil Pública mechanism—a form of collective lawsuit intended to defend rights too diffuse to pursue individually. The defendants were four corporations whose platforms had borne the brunt of the viral storm: Meta, which operates Facebook, Instagram, and WhatsApp; Google, the parent of YouTube; X; and ByteDance, TikTok’s Chinese owner.
The lawsuit’s central contention challenges how we understand platforms’ relationship to the content they host. Rather than treating these companies as neutral conduits, the complaint argues they function as active distributors—systems that do not merely permit speech but select it, amplify it, arrange it for maximum engagement. The organisation seeks both financial damages, reported by Brazilian media to approach one hundred million reais, and structural changes to how platforms moderate content, particularly content targeting marginalised communities.
The project’s name reaches back through four centuries of violence. Tybyra do Maranhão appears in colonial records from 1614, when French Capuchin missionaries documented the execution of a Tupinambá person for what they termed sodomy. The name itself derives from a Tupi word denoting those who loved others of their own sex. According to the friar Yves d’Evreux’s account, Tybyra was tied to a cannon at the Fort of São Luís and shot before assembled indigenous leaders—a pedagogical murder intended to demonstrate the new moral order being imposed on the land. By invoking this figure, Código Não Binário positions digital harassment within a longer genealogy, suggesting that algorithmic violence represents not rupture but continuation.
The work of transformation began pragmatically. Faced with thousands of abusive messages, the organisation developed a methodology for converting hostility into data. Staff members manually classified two thousand comments, creating categories for different forms of harassment. This initial taxonomy allowed an AI system to process the remainder—messages that would have been psychologically devastating to read individually but became, in aggregate, evidence. Gimenes has described this approach as a form of alchemy, transmuting the language of persecution into the language of litigation.
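In technical terms, the workflow Gimenes describes is a familiar one: annotate a small sample by hand, train a classifier on it, then let the model process the volume no human team could. The sketch below illustrates the general shape of such a pipeline in Python. The file names, category labels, and model choice are illustrative assumptions on our part, not details of TybyrIA itself, which the organisation has not published.

```python
# A minimal sketch of a label-then-scale classification pipeline,
# assuming hypothetical file names and harassment categories.
# This is NOT Código Não Binário's actual system.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from sklearn.pipeline import make_pipeline

# ~2,000 manually classified comments: columns "text" and "category"
# (e.g. "mockery", "slur", "threat", "neutral") — labels assumed here.
labelled = pd.read_csv("labelled_comments.csv")

train, test = train_test_split(
    labelled, test_size=0.2, stratify=labelled["category"], random_state=42
)

# Character n-grams tend to cope better with deliberately obfuscated
# slurs than word tokens alone.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 5), min_df=2),
    LogisticRegression(max_iter=1000),
)
model.fit(train["text"], train["category"])

# Check that the hand-built taxonomy holds up before trusting the
# model at scale.
print(classification_report(test["category"], model.predict(test["text"])))

# Apply to the thousands of messages no one should have to read singly.
unlabelled = pd.read_csv("unlabelled_comments.csv")
unlabelled["predicted_category"] = model.predict(unlabelled["text"])
unlabelled.to_csv("classified_comments.csv", index=False)
```

The design choice matters as much as the code: the held-out evaluation step is what turns a pile of predictions into something a court might treat as evidence rather than conjecture.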
The research yielded patterns that academic studies had long suspected but rarely documented at this scale. Comments expressing outrage or contempt consistently attracted higher engagement rates than neutral or supportive responses. Platform algorithms, optimised to maximise time spent on-site, appeared to reward emotional intensity regardless of valence. This created what the organisation describes as a structural incentive towards extremity: users who wished to be heard learned, consciously or otherwise, that hostility travelled further than nuance.
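The underlying comparison is simple to state, if laborious to perform at scale: for each category of comment, measure the interactions it earned relative to its reach. A sketch of that analysis follows, again with hypothetical column names; the organisation's actual metrics and data schema have not been made public.

```python
# Illustrative engagement analysis, assuming a classified dataset with
# hypothetical columns: likes, replies, shares, impressions.
import pandas as pd

comments = pd.read_csv("classified_comments.csv")

# Engagement rate: interactions per impression, guarding against
# division by zero for comments with no recorded reach.
comments["engagement_rate"] = (
    comments["likes"] + comments["replies"] + comments["shares"]
) / comments["impressions"].clip(lower=1)

# If the structural incentive described above exists, hostile
# categories should rank above neutral and supportive ones.
print(
    comments.groupby("predicted_category")["engagement_rate"]
    .agg(["mean", "median", "count"])
    .sort_values("mean", ascending=False)
)
```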
The legal strategy rests on the Marco Civil da Internet, Brazil’s 2014 framework for online rights, enacted partly in response to revelations of NSA surveillance. That law established foundational principles—freedom of expression, privacy protections, network neutrality—but predated the full emergence of algorithm-driven content distribution. The question before Brazilian courts will be whether platforms can be held accountable not only for hosting harmful speech but for the systems that determine which speech reaches which audiences, and in what order.
The supporting coalition speaks to the breadth of communities targeted by online harassment. Alongside Código Não Binário stand IBRAT, the Brazilian Institute of Transmasculinities; Fonatrans, the National Forum of Black Trans Women and Men; and the feminist publication AzMina. Together, they frame the action as addressing systemic discrimination rather than isolated incidents—an attempt to establish that certain groups face not random hostility but coordinated campaigns enabled by platform design.
The platforms themselves maintain extensive community guidelines prohibiting hate speech and harassment. Each invests substantially in moderation infrastructure, employing thousands of reviewers and deploying increasingly sophisticated automated systems. Yet enforcement remains uneven, particularly in non-English contexts where training data for automated moderation is scarcer. Brazil, with its immense and highly active social media population, often reveals the limitations of systems designed primarily for North American and European users.
The lawsuit unfolds against a distinctive moment in platform governance. Across Europe and the Americas, legislators grapple with similar questions about algorithmic accountability. The European Union’s Digital Services Act imposes transparency requirements on recommendation systems. The United States sees proliferating state-level attempts to regulate content moderation. Brazil itself has emerged as particularly assertive, having recently imposed substantial fines on X for refusing to block accounts accused of spreading election misinformation.
Beyond its legal dimensions, TybyrIA represents an epistemological shift in how marginalised communities document persecution. LGBT+ activists have long maintained archives—handwritten logs of police raids during military dictatorships, databases tracking contemporary anti-trans violence. These archives serve dual purposes: historical memory and political leverage. Artificial intelligence extends this tradition, offering the capacity to capture harassment at a scale and speed that exceeds human observation.
There is also something more intimate at work. Members of Código Não Binário have spoken about the psychological toll of inhabiting hatred—of reading, day after day, messages wishing you erased. Converting those messages into structured evidence allowed a certain distance, transforming personal injury into collective testimony. The technology became not merely a tool for documentation but a strategy for survival, a means of metabolising violence without being consumed by it.
Brazilian courts will now weigh questions that extend far beyond this particular case. Can algorithmic systems be held responsible for patterns of amplification? How should liability be distributed across platforms used by billions worldwide? The timing is significant: similar cases proceed simultaneously in other jurisdictions, creating the possibility of converging legal standards or irreconcilable jurisdictional conflicts.
The outcome remains uncertain. Yet by reframing algorithmic amplification as actionable harm rather than an inevitable byproduct, Código Não Binário has introduced a new grammar for discussing platform responsibility. The organisation’s approach suggests that communities subjected to online hostility need not simply endure it or appeal to corporate discretion. They can document it systematically, analyse it rigorously, and demand accountability through law.
On the Maranhão coast, where Tybyra’s story entered colonial archives four centuries ago, the Atlantic continues its patient erosion. Historical memory in Brazil survives in fragments—scattered references, oral traditions, partial records. TybyrIA proposes a contemporary archive, one constructed from the digital traces of present-day prejudice and resistance alike. Whether that archive will reshape the global conversation about platform governance remains to be seen. But in converting the language of hatred into evidence and argument, it has established that the struggle for equality in the twenty-first century unfolds as much through data and algorithms as through physical spaces—that technology can be not only a site of violence but an instrument of accountability.