The Hoffman Lenses Initiative

Making the invisible visible.

A human rights case for abolishing behavioral manipulation systems.
An open-source tool that shows you exactly what the algorithm is doing to you.

~80 children killed by one viral challenge
97% of recommended content remains harmful
eight years after Molly Russell's death
$0 the cost of Hoffman Lenses.
Forever.

Dedicated to the children killed by algorithmic violence

JackLynn Blackwell, 9 · Molly Russell, 14 · Nylah Anderson, 10 · CJ Dawley, 14 · Amanda Todd, 15 · Sadie Riggs, 15 · and hundreds more →

This is not about social media.
It is about the machine inside it.

On February 3, 2026, nine-year-old JackLynn Blackwell went out to play in her backyard in Stephenville, Texas. Her father found her minutes later with a cord around her neck. She had been served a choking challenge video by an algorithm.

She is not an anomaly. She is a data point in a pattern that has been documented, studied, reported — and ignored — for over a decade. Behavioral Manipulation Systems (BMS) — the algorithmic engines that decide what you see next on every major social platform — are injuring and killing human beings as a direct and foreseeable consequence of how they are designed to operate.

These systems do not recommend content. They manipulate behavior. The distinction is not semantic. A recommendation serves your interest. A manipulation serves the platform's interest at your expense. They monitor your psychological responses in real time, identify your vulnerabilities, and serve you increasingly intense content to keep you on the platform — because your distress, your outrage, your grief are the product being sold to advertisers.

Read the full human rights case →
I
Right to Life
ICCPR Article 6 · UNCRC Article 6
II
Right to Health
ICESCR Article 12 · UNCRC Article 24
III
Right to Privacy
ICCPR Article 17 · UNCRC Article 16
IV
Best Interests of the Child
UNCRC Article 3
V
Corporate Duty of Care
UN Guiding Principles on Business & Human Rights

"You could check on your kid, it could be kid-friendly videos, and then three minutes later it could be totally something dark because of the algorithms they start creating. There's too many of these kids lost for these companies not to be held accountable."

— Curtis Blackwell, father of JackLynn Blackwell, age 9. February 2026.

Put on the glasses.

The Hoffman Browser is a full desktop browser that reads every page you visit the way an expert reads for manipulation — not word by word, but all at once, understanding what the language is designed to do to you.

Named for the glasses in John Carpenter's They Live — once you see the hidden messages, you cannot unsee them. The browser works the same way. It does not tell you what to think. It shows you what is being done to you, and trusts you to think for yourself.

Analysis runs entirely on your device using a local AI model. No page content is ever transmitted anywhere. The website you are reading never knows it is being analyzed.

Local AI reads the full page and identifies manipulation techniques — outrage engineering, war framing, false authority, tribal activation, and more
Quotes the exact language being used against you and explains the technique
"Why is this here?" — shows you who owns the site, their documented business model, and cases of documented harm
Reads rendered screen text — not DOM structure that platforms can hide or obfuscate
Works on every website — news, social media, blogs, political content, advertising
Zero data retention. All processing local. No exceptions.
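The pipeline described above — read rendered screen text, flag a technique, quote the exact language, explain it — can be sketched in miniature. Everything below is illustrative only: the `Finding` structure, the `CUES` table, and the keyword matcher are hypothetical stand-ins for the browser's local AI model, shown purely to make the shape of the output concrete. No page content leaves the process, mirroring the local-only guarantee.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    technique: str    # e.g. "war_framing"
    severity: str     # "LOW" | "MEDIUM" | "HIGH"
    quote: str        # the exact language from the page
    explanation: str  # why it was flagged

# Toy stand-in for the local model: a few keyword cues per technique.
# (Hypothetical — the real browser analyzes full pages with a local AI.)
CUES = {
    "war_framing": ["war with", "battle for", "under siege"],
    "outrage_engineering": ["you won't believe", "destroys", "slams"],
}

def analyze(rendered_text: str) -> list[Finding]:
    """Scan rendered screen text and flag manipulation techniques.

    All processing happens inside this function; nothing is
    transmitted anywhere."""
    findings = []
    lowered = rendered_text.lower()
    for technique, cues in CUES.items():
        for cue in cues:
            i = lowered.find(cue)
            if i != -1:
                findings.append(Finding(
                    technique=technique,
                    severity="HIGH",
                    quote=rendered_text[i:i + len(cue)],
                    explanation=f"matched cue {cue!r}",
                ))
                break  # one finding per technique in this sketch
    return findings

if __name__ == "__main__":
    page = "WAR WITH IRAN: you won't believe what happened next"
    for f in analyze(page):
        print(f.technique, f.severity, repr(f.quote))
```

Running the sketch on the sample headline flags both `war_framing` and `outrage_engineering`, each with the quoted source language — the same shape as the analysis panel shown in the example below.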

The Hoffman Browser is open source and always will be. It cannot be bought. It cannot be silenced. Once it is in the world, it belongs to the world.

foxnews.com
war_framing — HIGH — "WAR WITH IRAN"
Frames a diplomatic situation as armed conflict to maximize alarm and urgency.
outrage_engineering — HIGH
Language calibrated for maximum emotional response rather than information.
Why is this here?
Fox Corporation · $14.7B revenue · Advertising model · 2 documented harm cases

They knew. They chose profit.

2017
Molly Russell dies. Meta's algorithm served the 14-year-old escalating self-harm content she never requested. A London coroner formally ruled it contributed to her death — the first time in history a child's death was officially attributed to algorithmic violence.
2021
The Facebook Papers. Whistleblower Frances Haugen reveals Meta's internal research: "We make body image issues worse for one in three teen girls." Recommendations to change the algorithm were overruled because they would reduce engagement metrics.
2021
Nylah Anderson, age 10, dies. TikTok's For You page served her the Blackout Challenge. Her family sued. A federal appeals court revived the case in 2024, ruling the algorithm itself — not user content — may be liable.
2024
Senate testimony. Platform CEOs appear before the Senate Judiciary Committee. Zuckerberg says he is "sorry for everything you have all been through." He does not apologize for the algorithm. Nothing changes.
2025
97% still harmful. Eight years after Molly Russell's death, the Molly Rose Foundation finds that 97% of content served to teen accounts engaging with depression material on Instagram remains harmful. The algorithm is unchanged.
2026
JackLynn Blackwell, age 9, dies. Stephenville, Texas. Found by her father. She loved karaoke. She wanted to be a star. TikTok and Snap settle a major California lawsuit days later — rather than explain their algorithms under oath in open court.

They deserved better than to be
engagement metrics.

This is a permanent, publicly accessible record of children killed by algorithmic violence — maintained with the consent and participation of their families. We are building this list. If you have lost a child and want their name here, contact us.

JackLynn Blackwell age 9 Stephenville, Texas · 2026
Molly Russell age 14 London, United Kingdom · 2017
Nylah Anderson age 10 Philadelphia, Pennsylvania · 2021
CJ Dawley age 14 Kenosha, Wisconsin
Amanda Todd age 15 British Columbia, Canada · 2012
Sadie Riggs age 15 Pennsylvania · 2015

This list is growing. Help us make it complete.

What you can do today.

01

Get the Hoffman Browser

Put on the glasses. Use the browser for one week on the sites you actually visit. Then decide what you think.

Get the browser →
02

Read the White Paper

The full human rights case. Legally precise. Fully cited. Released under Creative Commons — reproduce it freely.

Read & download →
03

Share the Case

Send the white paper to your elected representatives. Send it to journalists. Send it to other parents. This document belongs to everyone.

Download & share →
04

Switch to Chronological

Every major platform hides the chronological feed option in settings because they don't want you to use it. Find it. Use it.

How to do it →
05

For Families

If you have lost a child to algorithmic violence, we want to hear from you. Your child's name belongs on the remembrance list — if you want it there.

Contact us →
06

For Developers

The browser is open source. Fork it. Improve it. Add OCR support. Help us read the text that platforms hide in images. This tool belongs to the world.

Contribute on GitHub →

The Hoffman Lenses Initiative

An independent, non-partisan, non-commercial project dedicated to making Behavioral Manipulation Systems visible, legally accountable, and ultimately obsolete.

Not funded by platforms, advertisers, or political organizations. We accept no money from any entity with a financial interest in the continuation of behavioral manipulation technology.

Named for the glasses in John Carpenter's 1988 film They Live — which allowed the wearer to see the hidden messages embedded in ordinary reality. Once you see what the machine is doing to you, you cannot unsee it.