A nine-year-old went out to play.
On February 3, 2026, nine-year-old JackLynn Blackwell went out to play in her backyard in Stephenville, Texas. Her father found her minutes later with a cord around her neck. Her family says she had been served a choking challenge video by an algorithm — a claim TikTok and Snap settled rather than contested in open court.
JackLynn loved karaoke. She wanted to be a star. She was nine years old.
She is not an anomaly. She is a data point in a pattern that has been documented, studied, reported — and ignored — for over a decade. The platforms that designed the systems that served her that content had internal research, shareholder reports, and court proceedings telling them exactly what those systems were doing to children. They chose not to change them.
"You could check on your kid, it could be kid-friendly videos, and then three minutes later it could be totally something dark because of the algorithms they start creating. There's too many of these kids lost for these companies not to be held accountable."
— Curtis Blackwell, father of JackLynn Blackwell, age 9. February 2026.

These systems do not recommend content. They manipulate behavior.
The distinction is not semantic. A recommendation serves your interest. A manipulation serves the platform's interest at your expense.
Behavioral Manipulation Systems (BMS) — the algorithmic engines that decide what you see next on every major social platform — monitor your psychological responses in real time, identify your vulnerabilities, and serve you increasingly intense content to keep you on the platform. Your distress, your outrage, your fear, your grief are the product being sold to advertisers.
A platform optimizing for your wellbeing would show you content that informed and connected you, then let you leave. A BMS optimizing for engagement-driven revenue shows you content engineered to keep you scrolling — and the content most effective at doing that is content that makes you feel something intense. Outrage. Fear. Shame. Tribal identity. Urgency. These are not side effects. They are the design.
Behavioral Manipulation System (BMS): an algorithmic content-ranking system that selects and sequences information to modify the behavior and emotional state of users in ways that serve platform revenue objectives, without the user's informed consent, using psychological profiling derived from behavioral monitoring.
Distinct from a recommendation system, which optimizes for user-defined preferences. A BMS optimizes for engagement metrics that correlate with advertising revenue, regardless of the user's stated preferences or documented wellbeing.
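The distinction can be made concrete in code. What follows is a minimal, purely illustrative sketch in Python, not anything drawn from a real platform's systems: the item fields, the vulnerability score, and the weights are all hypothetical. It shows the structural difference the definition above describes, the same ranking loop pointed at two different objectives.

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    stated_interest_fit: float    # 0-1: how well the item matches what the user asked for
    predicted_watch_time: float   # seconds a behavioral model expects the user to stay
    emotional_intensity: float    # 0-1: how strongly the item provokes a reaction

def recommendation_score(item: Item) -> float:
    """A recommendation system: rank purely by fit to the user's stated preferences."""
    return item.stated_interest_fit

def engagement_score(item: Item, vulnerability: float) -> float:
    """A BMS-style objective (illustrative): rank by predicted time-on-platform,
    with emotionally intense items weighted up for users whose behavioral
    profile marks them as more reactive to that intensity."""
    return item.predicted_watch_time * (1 + vulnerability * item.emotional_intensity)

feed = [
    Item("karaoke tutorial", stated_interest_fit=0.9,
         predicted_watch_time=40, emotional_intensity=0.1),
    Item("viral 'challenge' clip", stated_interest_fit=0.1,
         predicted_watch_time=70, emotional_intensity=0.9),
]

# Same two items, opposite winners once the objective changes.
print(max(feed, key=recommendation_score).title)                # -> karaoke tutorial
print(max(feed, key=lambda i: engagement_score(i, 0.8)).title)  # -> viral 'challenge' clip
```

The arithmetic is trivial by design. What matters is the objective: once the quantity being maximized is predicted time-on-platform rather than fit to stated interests, the intense clip wins the ranking, and a higher vulnerability score makes it win by more.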
The most vulnerable users are not incidental casualties. They are the most profitable users. A child experiencing anxiety, depression, or social isolation is a child whose emotional responses are most easily exploited for engagement. The algorithm does not protect them. It targets them.
They knew. Year by year.
This is not speculation. The following is a chronological record of documented knowledge — internal research, whistleblower disclosures, court proceedings, and regulatory findings — establishing that platform executives were aware of the harms their systems were causing and chose not to change them.
-
2017: Molly Russell dies. London, United Kingdom. Age 14.
Meta's Instagram algorithm served Molly Russell escalating self-harm and suicide-related content she never searched for or requested. In September 2022, a London coroner ruled that social media content contributed to her death — the first time a coroner formally issued such a ruling. The coroner found that the content Molly was served by the algorithm was "not safe" and was a "real and more than minimal contribution" to her death.
Meta argued in the inquest that it bore no responsibility for content served by its algorithm to a vulnerable 14-year-old.
-
2021: The Facebook Papers. Internal research made public.
Whistleblower Frances Haugen provides internal Meta documents to regulators and journalists. The documents reveal that Meta's own researchers found the platform worsened body image issues for one in three teenage girls. Recommendations to change the algorithm to reduce harm were reviewed and overruled because the changes would reduce engagement metrics.
Haugen testifies before the US Senate: "Facebook knows that its amplification algorithms can lead children from innocuous topics — such as healthy food — to anorexia-promoting content over a very short period of time."
-
2021: Nylah Anderson dies. Philadelphia, Pennsylvania. Age 10.
Nylah Anderson's family alleged that TikTok's For You Page algorithm served her the "Blackout Challenge" — a viral trend in which participants choke themselves until losing consciousness. They alleged she attempted the challenge and died. TikTok denied the algorithm served her the content.
In 2024, a federal appeals court allowed the case to proceed, ruling that the act of recommendation is the platform's own conduct: the algorithm itself, not just the content it distributes, may be subject to liability.
-
2023: State attorneys general file suit. 41 states and the District of Columbia.
Attorneys general from 41 states and the District of Columbia sue Meta, alleging the company knowingly designed Instagram and Facebook to be addictive to children. Thirty-three states file jointly in federal court; the others file separately. The complaints cite internal Meta research and argue the company deliberately obscured the extent of its manipulation from parents, regulators, and the public.
-
2024: Senate Judiciary Committee hearing. Platform CEOs testify.
Mark Zuckerberg, Shou Zi Chew, Evan Spiegel, Linda Yaccarino, and Jason Citron appear before the Senate Judiciary Committee. Bereaved families are present in the gallery, holding photographs of their children.
Zuckerberg says he is "sorry for everything you have all been through." He does not apologize for the algorithm. He does not commit to changing it. Nothing changes.
-
2025: 97% still harmful. Eight years after Molly Russell's death.
A study commissioned by the Molly Rose Foundation tests what Instagram's algorithm serves to teen accounts that engage with depression-related content. 97% of algorithmically recommended content served to those accounts remains harmful — self-harm content, eating disorder content, or content that normalizes suicide. Eight years after Molly Russell's death. The algorithm is unchanged in this respect.
-
2026: JackLynn Blackwell dies. Stephenville, Texas. Age 9.
Found by her father. Days later, TikTok and Snap settle a major California class action lawsuit rather than explain their algorithms under oath in open court. The settlement amount is not disclosed. The algorithm is unchanged.
This is a human rights violation. It has a name.
The conduct of behavioral manipulation platforms does not exist in a legal vacuum. It violates established international human rights instruments that bind both states and, under evolving frameworks, corporations operating transnationally.
The white paper sets out the specific rights at issue and the international instruments establishing them, and builds the full legal argument across each framework. That argument includes the developing doctrine of corporate responsibility under international law, the application of Section 230 immunity limitations to algorithmic conduct, and the evidentiary record establishing the "knowing element": that platforms were aware of the harm and chose not to act.
Read the full human rights case.
The white paper documents the legal argument in full — platform by platform, harm by harm, with primary source citations. Released under Creative Commons. Reproduce it freely. Send it to anyone who needs to see it.