A human rights case for abolishing behavioral manipulation systems.
An open-source tool that shows you exactly what the algorithm is doing to you.
Dedicated to the children killed by algorithmic violence
On February 3, 2026, nine-year-old JackLynn Blackwell went out to play in her backyard in Stephenville, Texas. Her father found her minutes later with a cord around her neck. She had been served a choking challenge video by an algorithm.
She is not an anomaly. She is a data point in a pattern that has been documented, studied, reported — and ignored — for over a decade. Behavioral Manipulation Systems (BMS) — the algorithmic engines that decide what you see next on every major social platform — are injuring and killing human beings as a direct and foreseeable consequence of how they are designed to operate.
These systems do not recommend content. They manipulate behavior. The distinction is not semantic. A recommendation serves your interest. A manipulation serves the platform's interest at your expense. They monitor your psychological responses in real time, identify your vulnerabilities, and serve you increasingly intense content to keep you on the platform — because your distress, your outrage, your grief are the product being sold to advertisers.
Read the full human rights case →

"You could check on your kid, it could be kid-friendly videos, and then three minutes later it could be totally something dark because of the algorithms they start creating. There's too many of these kids lost for these companies not to be held accountable."
— Curtis Blackwell, father of JackLynn Blackwell, age 9. February 2026.
The Hoffman Browser is a full desktop browser that reads every page you visit the way an expert reads manipulation — not word by word, but all at once, understanding what the language is designed to do to you.
Named for the glasses in John Carpenter's They Live — once you see the hidden messages, you cannot unsee them. The browser works the same way. It does not tell you what to think. It shows you what is being done to you, and trusts you to think for yourself.
Analysis runs entirely on your device using a local AI model. No page content is ever transmitted anywhere. The website you are reading never knows it is being analyzed.
The Hoffman Browser is open source and always will be. It cannot be bought. It cannot be silenced. Once it is in the world, it belongs to the world.
This is a permanent, publicly accessible record of children killed by algorithmic violence, maintained with the consent and participation of their families. We are building this list. If you have lost a child and want their name here, contact us.
This list is growing. Help us make it complete.
Put on the glasses. Use the browser for one week on the sites you actually visit. Then decide what you think.
Get the browser →

The full human rights case. Legally precise. Fully cited. Released under Creative Commons — reproduce it freely.
Read & download →

Send the white paper to your elected representatives. Send it to journalists. Send it to other parents. This document belongs to everyone.
Download & share →

Every major platform hides the chronological feed option deep in its settings because it doesn't want you to use it. Find it. Use it.
How to do it →

If you have lost a child to algorithmic violence, we want to hear from you. Your child's name belongs on the remembrance list — if you want it there.
Contact us →

The browser is open source. Fork it. Improve it. Add OCR support. Help us read the text that platforms hide in images. This tool belongs to the world.
Contribute on GitHub →

An independent, non-partisan, non-commercial project dedicated to making Behavioral Manipulation Systems visible, legally accountable, and ultimately obsolete.
Not funded by platforms, advertisers, or political organizations. We accept no money from any entity with a financial interest in the continuation of behavioral manipulation technology.
Named for the glasses in John Carpenter's 1988 film They Live — which allowed the wearer to see the hidden messages embedded in ordinary reality. Once you see what the machine is doing to you, you cannot unsee it.