Meta, the parent company of Facebook, Instagram, WhatsApp, and Messenger, is facing lawsuits from 33 U.S. states, led by Colorado and California, over allegations that it deliberately designed features on its platforms to hook and deceive children. The joint lawsuit, filed in the U.S. District Court for the Northern District of California, accuses Meta of violating consumer protection laws by ensnaring children and misleading users about the safety of its platforms. The District of Columbia and eight other states have also filed separate lawsuits against Meta, many making similar claims.
The states’ complaint asserts that Meta designed psychologically manipulative features to induce compulsive and extended use of platforms like Instagram among young users. The states argue that the company’s algorithms pushed children and teenagers into harmful content rabbit holes through features like “infinite scroll” and persistent alerts. The lawsuit also accuses Meta of violating a federal children’s online privacy law, the Children’s Online Privacy Protection Act (COPPA), by collecting the personal data of its youngest users without their parents’ permission.
The states’ 233-page lawsuit alleges that Meta harnessed powerful technologies to entice, engage, and ensnare youth and teenagers for profit, claiming the company prioritized its earnings over public health and harmed its youngest users in the process.
Meta responded, saying that it was working to provide a safer environment for teenagers on its apps and had introduced over 30 tools to support teenagers and families. The company expressed disappointment that instead of collaborating with the industry to create clear, age-appropriate standards, the attorneys general had chosen to take legal action.
The lawsuits reflect an unusually broad coalition of states joining forces to sue a tech giant over consumer harms. States are making children’s online safety a priority and pooling legal resources to challenge Meta, much as they did in past campaigns against Big Tobacco and Big Pharma.
Lawmakers around the world have been working to regulate platforms like Instagram and TikTok to protect children. Several jurisdictions, including California, Utah, and Britain, have passed laws requiring social media platforms to strengthen privacy and safety protections for minors online. Regulators have also sought to hold social media companies responsible for potential harms to young people.
Concerns about Instagram’s potential harm to young people have been building for years. In early 2021, Facebook announced plans to develop “Instagram Kids,” a version of its app for users under 13, but the proposal drew backlash from lawmakers and children’s groups. Attorneys general from more than 40 states wrote a letter to Mark Zuckerberg urging the company to abandon the plans. Concerns escalated after a former Facebook employee leaked internal research showing the platform’s mental health risks to young users, and the company paused development of Instagram Kids.
The attorneys general are seeking financial penalties from Meta under local and state consumer protection laws. They are also requesting injunctive relief to force the company to stop using certain product features that they allege have harmed young users.
Meta is expected to contest the lawsuit. Colorado’s attorney general said he filed the suit because a settlement could not be reached with the company, noting that Meta had moved to dismiss a separate lawsuit brought by consumers that contains similar allegations of harm to children and teenagers.
Separately, a group of attorneys general from more than 40 states is investigating user engagement practices at TikTok and their potential harm to young people. That inquiry, announced in 2022, remains ongoing.