Your legal options may be broader than you know.
Courts are actively reconsidering platform immunity. A federal appeals court ruled in 2024 that recommendation algorithms — not just hosted content — may be subject to liability. Several law firms now specialize in platform accountability cases and represent families at no upfront cost.
Social Media Victims Law Center
Specializes in cases involving social media harm to minors. Free case evaluation. Has represented families in suits against Meta, TikTok, Snap, and Google.
socialmediavictims.org
Molly Rose Foundation
Founded by Ian Russell after the death of his daughter Molly. Campaigns for platform accountability and provides support to bereaved families in the UK and internationally.
mollyrosefoundation.org
Electronic Frontier Foundation
Tracks platform liability cases and Section 230 reform. The EFF's case database documents relevant rulings on algorithmic harm and content recommendation.
eff.org
Center for Humane Technology
Founded by former tech insiders. Publishes research on algorithmic harm and supports families seeking to make documented cases in policy and legal contexts.
humanetech.com
The white paper produced by the Hoffman Lenses Initiative documents the known evidence of platform harm and the legal theory of the case. It is available for use by families, attorneys, and advocates under a Creative Commons license.
Evidence you may not know you have.
If your child used a social media platform, that platform holds records of every piece of content served to them by its algorithm. In the United States, you can request this data from platforms under existing data portability laws. In the UK and EU, GDPR gives you a right to this data regardless of whether legal proceedings are underway.
Preserve the following immediately, before anything is deleted or accounts are closed:
- The child's device, ideally unmodified
- Any social media accounts (do not delete; preserve login credentials)
- Screenshots of content that was served, if accessible
- Screen time reports, if enabled on the device
- Any messages, searches, or saved content
Submit a data access request to each platform your child used. Most platforms provide a downloadable archive of account activity, including content served, watch history, and interaction signals. This data reflects the profile the algorithm built and used to target your child.
An attorney specializing in platform cases can issue a litigation hold letter, which legally obligates the platform to preserve records beyond their standard deletion schedules. This should be done as early as possible.
Testimony changes laws. Yours may be what tips the balance.
The families who have spoken publicly — Ian Russell, Matthew Bergman's clients, the Andersons, the Blackwells — have moved cases and legislation forward in ways that a decade of academic research could not. Platforms count on families remaining silent in their grief. The ones who have not stayed silent have changed what is politically and legally possible.
You should never feel obligated to speak publicly. Advocacy takes many forms, including private testimony to regulators, letters to elected officials, or simply allowing your family's case to proceed in court. But if you are willing to share your experience, organizations exist to amplify your voice.
ParentsTogether Foundation
Connects bereaved families with congressional offices and coordinates testimony for platform accountability hearings.
parents-together.org
Stop Hate for Profit
Works with families affected by platform harms to support media outreach and corporate advertiser pressure campaigns.
stophateforprofit.org
Making the harm visible — for everyone who comes after.
The Hoffman Browser is an open-source desktop browser that analyzes any webpage for behavioral manipulation techniques in real time, using a local AI model. Nothing leaves your device. No data is collected. No account is required.
The Behavioral Manipulation Intelligence Database (BMID) documents, platform by platform, what each company knew and when, what harms have been documented and by whom, and the financial motive behind each design decision.
Both tools are free. Both are open-source. Both are built specifically so that the argument "we didn't know" becomes impossible to sustain — in court, in legislatures, and in public.
Reach us directly.
If you have lost a child or are supporting a family who has, you can reach the Hoffman Lenses Initiative directly. We are not a law firm and we cannot provide legal advice, but we can connect you with organizations that can, share documentation that may support your case, and listen.
We will respond to every message from a bereaved family. That is not a policy. It is a commitment.
families@hoffmanlenses.org
Your email is not shared with any third party, ever. We do not use email marketing software. Emails to this address are read by a human being and replied to by one.
If you prefer additional privacy, you may contact us via ProtonMail at the same address, which supports end-to-end encrypted email from other ProtonMail accounts.
"Molly did not take her own life. She was killed by an algorithm that served her content it knew was harmful, to a child it knew was vulnerable, because engagement is more profitable than safety." — Ian Russell, father of Molly Russell, 14