The Hoffman Lenses Initiative

Making the invisible visible.

A human rights case for abolishing behavioral manipulation systems.
An open-source tool that shows you exactly what the algorithm is doing to you.

~80 children killed by one viral challenge
97% of recommended content remains harmful, eight years after Molly Russell's death
$0 the cost of Hoffman Lenses. Forever.

Dedicated to the children killed by algorithmic violence

JackLynn Blackwell, 9 · Molly Russell, 14 · Nylah Anderson, 10 · CJ Dawley, 14 · Amanda Todd, 15 · Sadie Riggs, 15 · and hundreds more →

This is not about social media.
It is about the machine inside it.

On February 3, 2026, nine-year-old JackLynn Blackwell went out to play in her backyard in Stephenville, Texas. Her father found her minutes later with a cord around her neck. She had been served a choking challenge video by an algorithm.

She is not an anomaly. She is a data point in a pattern that has been documented, studied, reported — and ignored — for over a decade. Behavioral Manipulation Systems (BMS) — the algorithmic engines that decide what you see next on every major social platform — are injuring and killing human beings as a direct and foreseeable consequence of how they are designed to operate.

These systems do not recommend content. They manipulate behavior. The distinction is not semantic. A recommendation serves your interest. A manipulation serves the platform's interest at your expense. They monitor your psychological responses in real time, identify your vulnerabilities, and serve you increasingly intense content to keep you on the platform — because your distress, your outrage, your grief are the product being sold to advertisers.

Read the full human rights case →
I
Right to Life
ICCPR Article 6 · UNCRC Article 6
II
Right to Health
ICESCR Article 12 · UNCRC Article 24
III
Right to Privacy
ICCPR Article 17 · UNCRC Article 16
IV
Best Interests of the Child
UNCRC Article 3
V
Corporate Duty of Care
UN Guiding Principles on Business & Human Rights

"You could check on your kid, it could be kid-friendly videos, and then three minutes later it could be totally something dark because of the algorithms they start creating. There's too many of these kids lost for these companies not to be held accountable."

— Curtis Blackwell, father of JackLynn Blackwell, age 9. February 2026.

Put on the glasses.

The Hoffman Lenses browser extension overlays real-time annotation on your social media feeds — showing you exactly what the algorithm is doing to you, as it does it.

Named for the glasses in John Carpenter's They Live — once you see the hidden messages, you cannot unsee them. The extension works the same way. It does not tell you what to think. It shows you what is being done to you, and trusts you to think for yourself.

Exposes non-chronological feed manipulation in real time
Shows ratio of chosen content vs. algorithmically inserted content
Identifies emotional escalation patterns as they occur
Tracks session duration and psychological content profile
Works on Facebook, Instagram, X, TikTok, YouTube
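As an illustration of how the first feature in the list above might work, here is a minimal TypeScript sketch that flags resurfaced posts: in a strictly chronological (newest-first) feed, no post should be older than any post that appears below it. The `Post` shape and the `flagResurfaced` function are hypothetical names for illustration only, not the extension's actual code.

```typescript
// Illustrative sketch only — not the extension's real data model.
interface Post {
  id: string;
  timestamp: number; // Unix epoch seconds, as scraped from the post
}

// In a newest-first chronological feed, timestamps never increase
// as you scroll down. Walk the feed bottom-up: any post that is
// older than the newest post below it was reordered by the platform.
function flagResurfaced(feed: Post[]): string[] {
  const flagged: string[] = [];
  let newestBelow = -Infinity;
  for (let i = feed.length - 1; i >= 0; i--) {
    if (feed[i].timestamp < newestBelow) {
      flagged.push(feed[i].id); // surfaced out of chronological order
    }
    newestBelow = Math.max(newestBelow, feed[i].timestamp);
  }
  return flagged.reverse(); // restore top-to-bottom feed order
}
```

A real implementation would also need per-platform scraping of timestamps from the DOM, but the core test — "is anything newer sitting below this post?" — is this simple.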

The extension is open source and always will be. It cannot be bought. It cannot be silenced. Once it is in the world, it belongs to the world.

facebook.com/feed
Post is 3 days old — surfaced now because you paused on similar content 4 min ago
NOT from your network — algorithmically inserted. Emotional trigger detected: outrage
You have been scrolling for 18 min. 11 of 14 posts were algorithmically selected.
Session profile
Outrage 71%
Anxiety 18%
Chosen content 11%
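The session profile shown above can be sketched as a simple tally: classify each post the extension observes, then report the share of each category. The `SessionPost` shape, the `origin` labels, and the emotion labels below are assumptions made for illustration; the extension's actual classifier and data model may differ.

```typescript
// Illustrative sketch of the session-profile tally — the shapes
// and labels here are assumptions, not the extension's real API.
type Origin = "chosen" | "inserted";

interface SessionPost {
  origin: Origin;                  // followed account vs. algorithmic insert
  emotion?: "outrage" | "anxiety"; // from a hypothetical content classifier
}

function sessionProfile(posts: SessionPost[]) {
  const n = posts.length || 1; // avoid division by zero on an empty session
  const pct = (k: number) => Math.round((100 * k) / n);
  return {
    chosen: pct(posts.filter((p) => p.origin === "chosen").length),
    outrage: pct(posts.filter((p) => p.emotion === "outrage").length),
    anxiety: pct(posts.filter((p) => p.emotion === "anxiety").length),
  };
}
```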

They knew. They chose profit.

2017
Molly Russell dies. Meta's algorithm served the 14-year-old escalating self-harm content she never requested. A London coroner formally ruled it contributed to her death — the first time in history a child's death was officially attributed to algorithmic violence.
2021
The Facebook Papers. Whistleblower Frances Haugen reveals Meta's internal research: "We make body image issues worse for one in three teen girls." Recommendations to change the algorithm were overruled because they would reduce engagement metrics.
2021
Nylah Anderson, age 10, dies. TikTok's For You page served her the Blackout Challenge. Her family sued. A federal appeals court revived the case in 2024, ruling the algorithm itself — not user content — may be liable.
2024
Senate testimony. Platform CEOs appear before the Senate Judiciary Committee. Zuckerberg says he is "sorry for everything you have all been through." He does not apologize for the algorithm. Nothing changes.
2025
97% still harmful. Eight years after Molly Russell's death, the Molly Rose Foundation finds that 97% of content served to teen accounts engaging with depression material on Instagram remains harmful. The algorithm is unchanged.
2026
JackLynn Blackwell, age 9, dies. Stephenville, Texas. Found by her father. She loved karaoke. She wanted to be a star. TikTok and Snap settle a major California lawsuit days later — rather than explain their algorithms under oath in open court.

They deserved better than to be
engagement metrics.

This is a permanent, publicly accessible record of children killed by algorithmic violence — maintained with the consent and participation of their families. We are building this list. If you have lost a child and want their name here, contact us.

JackLynn Blackwell, age 9 · Stephenville, Texas · 2026
Molly Russell, age 14 · London, United Kingdom · 2017
Nylah Anderson, age 10 · Philadelphia, Pennsylvania · 2021
CJ Dawley, age 14 · Kenosha, Wisconsin
Amanda Todd, age 15 · British Columbia, Canada · 2012
Sadie Riggs, age 15 · Pennsylvania · 2015

This list is growing. Help us make it complete.

What you can do today.

01

Install Hoffman Lenses

Put on the glasses. Use the extension for one week on your actual social media feeds. Then decide what you think.

Get the extension →
02

Read the White Paper

The full human rights case. Legally precise. Fully cited. Released under Creative Commons — reproduce it freely.

Read & download →
03

Share the Case

Send the white paper to your elected representatives. Send it to journalists. Send it to other parents. This document belongs to everyone.

Download & share →
04

Switch to Chronological

Every major platform hides the chronological feed option in settings because they don't want you to use it. Find it. Use it.

How to do it →
05

For Families

If you have lost a child to algorithmic violence, we want to hear from you. Your child's name belongs on the remembrance list — if you want it there.

Contact us →
06

For Developers

The extension is open source. Fork it. Improve it. Translate it. Add platform support. This tool belongs to the world.

Contribute on GitHub →

The Hoffman Lenses Initiative

An independent, non-partisan, non-commercial project dedicated to making Behavioral Manipulation Systems visible, legally accountable, and ultimately obsolete.

Not funded by platforms, advertisers, or political organizations. We accept no money from any entity with a financial interest in the continuation of behavioral manipulation technology.

Named for the glasses in John Carpenter's 1988 film They Live — which allowed the wearer to see the hidden messages embedded in ordinary reality. Once you see what the machine is doing to you, you cannot unsee it.