The Addiction Machines

A twenty-year-old woman from Chico, California, just brought Big Tech to its tobacco moment

Parents and family members of victims were at the court in LA to hear the verdict. Getty Images.

They brought photographs. Not as exhibits — the exhibits would come later, projected on screens, extracted from corporate servers — but laminated snapshots of children held flat against chests, the way you might hold a prayer card or a passport at a border crossing. Parents had flown in from across the United States to sit in the public gallery of the Los Angeles Superior Court and watch. Some of them had lost children. Others had watched their children become unreachable, swallowed by something they could not name in language the law would recognise. They stood on the courthouse steps with hand-painted signs, and they waited for a jury to say what they had been saying, unheard, for years.

They were there because of a twenty-year-old woman from Chico, California, whom most of them had never met.

On 25 March 2026, after nine days of deliberation totalling nearly forty-four hours — at one point the jurors told Judge Carolyn B. Kuhl they were struggling to reach consensus on one of the two defendants — the panel of five men and seven women came back. Meta Platforms, parent company of Instagram, and Google’s YouTube: liable on every count. Negligent in design. Negligent in operation. Aware the design was dangerous. Failed to warn. Ten jurors for the plaintiff on every question. Two for the defence. As the verdict was read, the plaintiff, identified in filings as K.G.M. and known to her legal team as Kaley, looked straight ahead, stony-faced. Her lawyers shook their heads in quiet approval. Outside, the parents wept.

Three million dollars in compensatory damages, split seventy–thirty between Meta and YouTube. Then, hours later, the punitive damages came back: 2.1 million against Meta, 900,000 against YouTube. Six million in total. The jurors had found both companies acted with ‘malice, oppression, or fraud’.

Six million dollars. Meta is worth roughly 1.5 trillion. The company’s share price ticked up five per cent in after-hours trading the same evening, which tells you everything you need to know about how the markets read the penalty and nothing at all about why the penalty matters. Because the money was never the point. The point was something harder to price: the first time a jury anywhere had looked at a social media platform and called it, in law, a defective product. Not a publisher. Not a marketplace of ideas. A machine with a manufacturing flaw, built to do exactly what it did.

Laura Marquez-Garrett, attorney with the Social Media Victims Law Center and counsel of record for Kaley, had said during deliberations that the trial was ‘a vehicle, not an outcome’. Behind the vehicle: more than two thousand pending cases. Parents, school districts, state attorneys general, all waiting to see which way the road went.

Now they know.

I.

Kaley got YouTube when she was six. Instagram at nine. Musical.ly — which later rebranded as TikTok — at ten. Snapchat at eleven. Nobody stopped her. No platform asked her age, or if they did, the asking was a form so perfunctory a child could defeat it. She told the jury she was on social media ‘all day long’. She said that being logged off would make her feel she was going to ‘miss out on something’, and this would send her ‘into a panic’. She stopped doing things she used to like. She pulled away from her family. Making friends at school was difficult because the phone was always there, always more immediate. According to the Pew Research Center, at least half of American teenagers use YouTube or Instagram daily. Kaley was not a teenager when she started. She was in Year 1.

By ten, she had uploaded more than three hundred videos to YouTube. (She thought it was two hundred. Her attorney corrected her gently on the stand.) The images she encountered on Instagram — filtered, retouched, performing a particular species of aspiration that the platform monetises with enormous efficiency — left her ‘feeling very depressed’, she testified, and chronically insecure about her own face. ‘I spent all my time on it,’ she told the court. ‘I would sneak it.’ The phrasing is worth pausing on. She was describing her relationship with a consumer product the way someone might describe a relationship with a substance.

Anxiety and depression arrived at ten. Body dysmorphia came later. Suicidal thoughts. She filed suit in 2023, at seventeen.

The defence tried to reroute the story. Attorneys for Meta and YouTube argued that Kaley’s psychological difficulties had their roots not in anything algorithmic but in a ‘turbulent home life’ — emotional troubles at home, learning difficulties, and family disruption documented across her medical records. They stressed, and this was their sharpest point, that not one of her therapists had ever identified social media as a factor in her struggles. The company’s logic was familiar enough to be almost comfortable: the product is fine, the problem is the user, the user was already broken. It is the same argument the tobacco industry ran for three decades, with diminishing returns and increasing cynicism, before the settlements caught up with them. But the plaintiffs here did not need to prove that Instagram caused Kaley’s mental health problems. They needed only to show it was a ‘substantial factor’ in causing harm. The gap between those two legal standards is narrow, but it was wide enough to drive a verdict through.

II.

Seven weeks. Jury selection started on 27 January 2026, testimony on 10 February. The courtroom heard from addiction experts, child psychologists, platform engineers, therapists, and three of the most powerful technology executives alive: Mark Zuckerberg, Meta’s chief executive; Adam Mosseri, who runs Instagram; and Cristos Goodrow, YouTube’s vice-president of engineering. Neal Mohan, YouTube’s chief executive, was not called. Snap and TikTok, both originally named as defendants, settled with the plaintiff before the trial began — Snap roughly a week before jury selection, TikTok on the actual day it started — for undisclosed amounts. Neither admitted wrongdoing, which is the standard corporate way of admitting you would rather pay than fight.

Zuckerberg testified on 18 February. His first time before a jury. He was shown internal Meta documents — not the kind that get leaked to journalists and can be dismissed as out-of-context, but the kind that get entered into a court record and read aloud under oath. One memo: ‘If we wanna win big with teens, we must bring them in as tweens.’ Internal research: eleven-year-olds were four times more likely to keep returning to Instagram than to competing apps, despite the platform’s stated minimum age of thirteen. And then, more damaging still, an internal communication whose candour had the quality of someone forgetting the room was wired: ‘Oh my gosh yall IG is a drug… Lol, I mean, all social media. We’re basically pushers.’

Zuckerberg, when pressed, said he ‘always wished’ for faster progress in identifying underage users. He said the company had reached the ‘right place over time’. When asked about safety, he offered a line his legal team presumably intended as reassuring: ‘If people feel like they’re not having a good experience, why would they keep using the product?’ It landed differently than intended. The answer to his rhetorical question was the entire basis of the lawsuit: they kept using it because the product was engineered so they could not stop.

The plaintiff’s lead trial attorney was Mark Lanier, a Texan who is also a part-time pastor and who has, over a thirty-year career, accumulated something close to twenty billion dollars in cumulative verdicts against corporations. The New York Times once described him as ‘one of the top civil trial lawyers in America’, which undersells it somewhat. In 2018, he secured 4.69 billion dollars from Johnson & Johnson on behalf of twenty-two women who said the company’s talcum powder, contaminated with asbestos, gave them ovarian cancer. He went on to lead the opioid litigation against CVS, Walgreens and Walmart — a 650.6 million dollar judgment. Lanier is what happens to a corporation when the regulators have gone home, and the courtroom is the last public health mechanism standing.

His style in Los Angeles was pure Lanier: part revival preacher, part forensic accountant. He unrolled a thirty-five-foot collage of selfies Kaley had posted to Instagram, hundreds of them, many shot through beauty filters, and let the jury look at it while Zuckerberg sat in the room. He called the tech companies ‘a lion stalking a pack of vulnerable gazelles’. He said YouTube had marketed itself to busy parents as a ‘digital babysitting service’. During the punitive damages phase, he held up a jar of M&Ms and told the jurors each one represented a billion dollars of the companies’ combined worth. Then he asked them to set a price on what had been done to Kaley. The showmanship was real, but it was load-bearing. Beneath it sat a structural legal argument that may prove more consequential than any dollar figure the jury could have chosen.

The argument was this: they were not suing over content. Jurors had been explicitly instructed to disregard the specific posts and videos Kaley had seen on the platforms. This was not a case about what Instagram showed her. It was a case about how Instagram worked. The infinite scroll. The algorithmic recommendation engine. Autoplay. The notification architecture. Beauty filters. The dopamine-calibrated feedback loops that the plaintiffs called, with calculated bluntness, a ‘digital casino’. The claim was product liability, not content moderation.

For decades, technology companies had wrapped themselves in Section 230 of the Communications Decency Act, the 1996 federal law that says internet platforms bear no legal responsibility for what their users post. It has been, until now, an almost impregnable shield. But by reframing the suit as a defective-design claim — by insisting that the damage was done not by the speech the platform carried but by the machine that carried it — the plaintiffs walked around Section 230 altogether. The question before the jury was not Did Instagram host harmful content? It was: Is Instagram a harmful product?

The jury said yes.

III.

Watch the executives resist the word ‘addiction’. This is where the tobacco parallel is not merely useful but structurally illuminating.

Mosseri, during his testimony, called the concept ‘problematic’. What he preferred was ‘problematic use’ — the concession that some users spend ‘too much time’ on the platform, wrapped in language that places the problem in the user’s behaviour rather than the product’s design. Goodrow, for YouTube, testified that the platform was ‘not designed to maximise time’. YouTube’s lawyers pushed a more radical claim: that YouTube is not social media at all. It is a streaming platform, they argued. Functionally television. They produced data showing Kaley had spent an average of about one minute per day on YouTube Shorts, the vertical short-form video feature that launched in 2020 and is the part of the product most directly comparable to TikTok’s infinite scroll.

One minute a day. It is the kind of statistic that sounds exonerating until you remember that the case was not about minutes but about mechanisms.

In 1994 — and the date matters, because it is now thirty-two years ago and the playbook has barely changed — the chief executives of America’s seven largest tobacco companies appeared before the House Energy and Commerce Committee’s Subcommittee on Health and the Environment. Each one swore, under oath, that he did not believe nicotine was addictive. Internal documents from Brown & Williamson, smuggled out by a paralegal and handed to Representative Henry Waxman, later proved every word of that testimony to be a lie. The tobacco industry’s strategy was to contest the science of addiction as long as it could, to place the risk on the consumer’s shoulders, and to insist that the product, sensibly used, caused no harm. John Uustal, a lawyer who fought the tobacco companies and who has been watching the social media trials closely, put the logic simply: ‘Legally, and morally and public-relations-wise, once you admit it’s addictive, you’re done.’

So you never admit it. You say ‘problematic use’. You say the science is complex. You say ‘profoundly complex’, because the adverb buys you another few years.

The analogy has limits, and it would be irresponsible not to name them. Tobacco delivers nicotine, a chemically addictive substance; the harm it does is countable in tumours and death certificates. Social media addiction — if it is a clinical entity at all — is behavioural. The current Diagnostic and Statistical Manual of Mental Disorders recognises only one form of behavioural addiction, which is gambling disorder. There is no consensus psychiatric diagnosis for social media addiction. Clay Calvert, a non-resident senior fellow of technology policy studies at the American Enterprise Institute, has called the tobacco analogy ‘deceptively flawed’, and he has a point: social media platforms carry First Amendment–protected speech, while cigarettes convey no expression whatsoever. Meta has argued, and not unreasonably, that social media brings real benefits to young people — community, creative expression, connection. Some researchers in the field agree.

But here is the thing about the 1990s tobacco settlement. It did not hinge on proving every smoker was an addict. It hinged on proving that the manufacturers knew the product was dangerous, marketed it to children regardless, and buried their own research. That is the structure K.G.M. v. Meta et al. lifted wholesale. The ‘tweens’ memo. The ‘pushers’ memo. The retention data on eleven-year-olds. The testimony about executive knowledge of the platforms’ effects on children. These documents did what the Brown & Williamson papers did three decades ago: they turned the company’s internal language into a public record of knowing complicity.

In 1998, five major tobacco companies settled with forty-six states. The bill has exceeded 200 billion dollars over twenty-five years. Whether the social media litigation will reach that scale is unknowable. What is knowable is that the legal architecture is now in place, and it is the same architecture: an industry is forced to reckon with what it did to children, not by the legislature (which failed) or the regulator (which was outgunned), but by the accumulated cost of losing at trial.

IV.

The timing was brutal. Twenty-four hours before the Los Angeles verdict, Meta had already lost.

On 24 March, a jury in Santa Fe found the company liable for violating New Mexico’s consumer protection laws and ordered it to pay 375 million dollars — five thousand dollars per violation, the statutory maximum, multiplied across thousands of children. The state’s attorney general, Raúl Torrez, had sued in 2023 after an undercover operation in which investigators created a fake profile of a thirteen-year-old girl. The account, Torrez said, was ‘simply inundated with images and targeted solicitations’ from predators within days. Days. If a decoy account operated by law enforcement could attract that volume of predatory contact that quickly, the implication was not subtle: Meta’s own systems, running at incomparably greater scale, had the same signals and did not act on them.

Evidence at trial included internal communications in which Meta employees had warned about the consequences of Zuckerberg’s 2019 decision to roll out end-to-end encryption on Facebook Messenger. The concern: encryption would cripple the company’s capacity to detect and report child sexual abuse material. The estimated volume: 7.5 million instances that would go unreported to law enforcement. New Mexico became the first state in America to win at trial against a major technology company for harming children. A second phase — a bench trial before Judge Bryan Biedscheid, scheduled for 4 May — will decide whether Meta created a public nuisance and can be compelled to make concrete design changes: real age verification, predator removal, and protections for minors against encrypted communications that shield abusers.

Two verdicts in forty-eight hours. Together they amount to something larger than a bad week for one corporation’s legal department. More than forty state attorneys general have filed suits against Meta. Over ten thousand individual cases and close to eight hundred from school districts are pending nationwide. A federal trial — consolidated claims by districts and families — is scheduled to begin in June in Oakland, in the Northern District of California. Matt Bergman, of the Social Media Victims Law Center, said the K.G.M. verdict ‘establishes a framework for how similar cases across the country will be evaluated’. It tells plaintiffs the theory works. It tells juries that the evidence is legible. It tells the companies that the shield they relied on for twenty-five years has a hole in it, and the hole is shaped like a design flaw.

V.

The American system does this through juries. Elsewhere, the tool has been legislated, and it has moved with startling speed.

Australia banned social media for under-sixteens in December 2025 — the first country on earth to do so, with fines of up to 49.5 million Australian dollars for platforms that fail to enforce it. France’s National Assembly passed an under-fifteen ban in January 2026, championed by Macron, with enforcement expected by September. Spain’s prime minister stood up at the World Government Summit in Dubai in February and announced his country would be the first in Europe to impose an under-sixteen ban, calling social media ‘a failed state, a place where laws are ignored and crime is endured’. Denmark secured cross-party backing for a ban on under-fifteens in late 2025. Portugal passed similar legislation in February. Germany commissioned a study. Norway is drafting a consultation proposal. The European Parliament, in a non-binding vote that passed with overwhelming support, called on the EU to set a minimum age of sixteen for social media and — this part is less widely reported — demanded a ban on infinite scrolling and autoplay features for minors.

That half a dozen European governments converged on essentially the same policy within six months says something about the political consensus, which is now broad enough to be boring. The interesting question is why it took this long. The EU’s Digital Services Act, the bloc’s main regulatory instrument for very large platforms, imposes transparency and risk-assessment obligations; it does not ban access by age. The national bans now emerging represent an implicit acknowledgement that asking platforms to assess their own risks and offer parental controls is not sufficient when the product’s core architecture — the scroll, the algorithm, the notification pulse — is engineered to maximise engagement regardless of who is engaging. You cannot put guardrails on a road that should not have been built.

This is the point at which the conversation gets difficult, and where GAY45.eu has a particular stake in pressing it. For LGBTQ+ young people, social media has been, and in many places remains, the only accessible community available. The first glimpse of people who share their experience. The difference between isolation and survival. The platforms that the litigation now calls addiction machines are, for some of their youngest users, also the thing that kept them alive. This is not a contradiction the regulatory debate has adequately reckoned with. A blunt age ban that shuts down access without building anything in its place — without funding youth services, without creating safe offline spaces for queer adolescents, without acknowledging that the reason these children were on their phones in the first place was partly that the world outside was hostile — risks solving one problem by compounding another. The populations most exposed to that risk are the ones for whom the offline world has offered the least.

VI.

Kaley was in the room when the verdict came. She is twenty. She still uses social media. This was reported in several outlets without comment, as though it were an oddity or an inconsistency in her story. It is neither. It is a description of the product. The platforms are built so that the cost of leaving — social, informational, connective — exceeds the psychological cost of staying. A rational person, even one who has testified in open court about the damage, will do the arithmetic and stay. That is not a failure of willpower. It is a feature of engineering.

Meta will appeal. Google will appeal. Both companies have said so. Meta called teen mental health ‘profoundly complex’ and said it ‘cannot be linked to a single app’. A Google spokesperson added that the case ‘misunderstands YouTube, which is a responsibly built streaming platform, not a social media site’. These statements may be individually defensible. They are also, word for word, the kind of thing tobacco executives said for thirty years before the money came due: the science is unsettled, causation is multifactorial, the consumer made choices. The statements are not wrong exactly. They are beside the point. The point is the memo about tweens. The point is the memo about pushers. The point is seven-and-a-half million instances of child sexual abuse material that would go undetected after an encryption decision made by a man who ‘always wished’ things moved faster.

Dona J. Fraser, senior vice-president for privacy initiatives at BBB National Programs, said after the verdicts that courts are starting to treat social media harm ‘less like a speech issue’ and ‘more like a product liability issue’. That sentence sounds technical. It is actually a sea change. Section 230 casts the platforms as publishers — neutral carriers of other people’s speech. K.G.M. reframes them as manufacturers. A publisher distributes what others create. A manufacturer builds the thing that does the damage. The jury in Los Angeles looked at the infinite scroll, the recommendation engine, the notification system, the filters, the metrics — and decided they were looking at a machine, not a library.

Jonathan Haidt, the psychologist behind The Anxious Generation, called it the start of ‘a new world’ for child safety. Maybe. Haidt has a tendency toward the epochal. What is less debatable is that a specific kind of impunity — the kind that attached to being too big, too new and too lucrative to be judged by the same rules as every other company that puts a product into the hands of a child — ended in a courtroom in Los Angeles on a Wednesday in March.

The parents on the courthouse steps were not celebrating six million dollars. They were celebrating a legal principle: that their grief had a name now, and the companies that built the grief could be made to answer for the building.

The next bellwether, R.K.C. v. Meta, is set for the summer. Two thousand cases behind it. Somewhere in Chico, a girl who was six when she first fell into the scroll is twenty now, and the scroll is still there, and the machine is still running, and the appeals will take years, and the children are still inside it.


Author

  • Jackson Williams is a staff writer for GAY45. He is a San Francisco–born journalist whose work has appeared in The New York Times, the San Francisco Chronicle, and the Bay Area Reporter, where he covers politics, culture, and the intersection of race and queer identity.