-
JackLynn Blackwell
She loved karaoke. She wanted to be a star. She was nine years old.
-
Molly Russell
She was creative and bright. In 2022, a UK coroner ruled that Instagram and Pinterest content contributed to her death — the first ruling of its kind anywhere in the world. Her father, Ian Russell, turned grief into testimony and fought to make her name mean something beyond what happened to her.
-
Nylah Anderson
She was ten years old. Her family alleged that TikTok's algorithm served her a choking challenge that had spread virally across the platform. Her mother, Tawainna Anderson, sued TikTok and became a voice in the fight for algorithmic accountability. A federal appeals court ruled in 2024 that TikTok's algorithm itself — not just the content — may be subject to liability.
-
Amanda Todd
She made a video before she died — holding up hand-written cards asking for someone to notice her pain. Millions did, too late. Her case documented how online harassment and sextortion, amplified across platforms, could destroy a young person's life with no intervention from the systems that carried the content.
-
CJ Dawley
He signed up for Facebook, Instagram, and Snapchat at fourteen. In the three years that followed, the platforms kept him online until 3 a.m. on school nights, fed anxieties about body image, and built a dependency his parents watched develop and could not stop. He was admitted to college in December 2014. He worked as a busboy at a Texas Roadhouse in Kenosha. On January 5, 2015, he was seventeen years old. He held his phone in one hand. His mother filed a wrongful death lawsuit against Meta and Snap. His full name was Christopher James Dawley. He went by CJ.
-
Sadie Riggs
She had red hair and braces and was bullied for both — in the hallways of Bedford Senior High School and then online, where the cruelty continued across Snapchat, Instagram, and Kik without pause or boundary. In the spring of 2017, a school incident became another round of humiliation that spread through the platforms. She was fifteen years old and in counseling. On June 19, 2017, her aunt found her. The obituary her mother and aunt wrote did not spare the people responsible: "For the bullies involved, please know you were effective in making her feel worthless. That is all between you and God now." Bedford, Pennsylvania, planted a tree in her name.
-
Englyn Roberts
She got her first phone at eleven. Instagram, Snapchat, and TikTok followed — and so did depression, anxiety, and self-harm, as the platforms served her content she had not sought. In September 2019, when she was thirteen, Instagram's algorithm delivered a video of a woman hanging herself. She shared it with a friend. A year later, on the morning of August 29, 2020, she replicated what she had seen. Her parents found her. They kept her on life support for nine days. On September 9, 2020, they brought her home. After she died, her father found the video still on her phone. It was still circulating on Instagram — with at least 1,500 documented views — until December 2021, more than a year after her death. CBS's 60 Minutes reported her story. Her parents, Brandy and Toney Roberts, filed suit against Meta, Snap, and ByteDance. She was fourteen years old.
-
Frankie Thomas
She was adopted as an infant, diagnosed with autism at five, and raised by parents who understood her vulnerabilities and fought hard to protect her. Her school in Hindhead, Surrey had a written plan requiring supervised internet use specifically because of those vulnerabilities. On September 25, 2018, Frankie was left alone with an unfiltered school iPad for approximately two hours. She found self-harm content on Wattpad — a storytelling platform that had failed to prevent children in under-seventeen accounts from accessing stories depicting suicide. The last story she read mirrored what she did next. A UK coroner ruled in 2021 that the school's failure to follow her plan "more than minimally contributed" to her death, and formally criticised Wattpad's failures. Her parents, Judy and Andy Thomas, pursued accountability through the courts. Her case was cited in the UK Parliament and contributed to the passage of the Online Safety Act 2023. Her full name was Frances-Rose Thomas. She was known as Frankie.
What happened to them
Each person on this page died in circumstances where algorithmic systems played a documented or legally established role. The platforms that delivered the content that reached them were not passive infrastructure. They were designed, optimized, and operated by people who had, in most cases, been warned.
The UK coroner's ruling in Molly Russell's case — September 2022 — established for the first time that platform-delivered content formally contributed to a child's death. The finding was not opinion. It was a ruling. Instagram and Pinterest content played a role.
That ruling did not appear from nowhere. It was the result of years of work by her family, by investigators, and by the fact that the evidence existed and was allowed into a courtroom. This is what accountability looks like when the system permits it.
Why we keep this page
We keep this page because names matter. Because the harm these systems cause is not abstract. Because behind every statistic about teen mental health and algorithmic radicalization and platform-amplified self-harm is a specific person who had a name, and a family that is living with what happened.
We keep this page because the companies that built and operated these systems have spent enormous resources making the harm feel diffuse and unprovable. Part of Hoffman's work is to refuse that framing. The harm happened to specific people. We know some of their names. We say them here.
We keep this page because it is a reminder of what we are building Hoffman for. Not for market share. Not for funding rounds. For JackLynn Blackwell, who was nine years old, and wanted to be a star.
For families
If you have lost someone and believe algorithmic harm played a role, you are not alone. Information about legal organizations, how to preserve evidence, and how to connect with others going through the same thing is available on the Families page.
If you believe a name should be added to this page, we review every entry carefully and require verified public-record documentation before publishing. We do not add names without director approval. Contact us at families@hoffmanlenses.org.
The full case
The legal and ethical framework for why Behavioral Manipulation Systems constitute a documented, foreseeable, and ongoing harm is laid out in the Hoffman Lenses white paper.
They deserved better than to be engagement metrics.
This page will be updated as names are confirmed and approved for publication. There are names we have not yet learned.
If you know a name, contact us at families@hoffmanlenses.org.