(CNN) -- Since at least 2019, Meta has knowingly refused to shut down most accounts belonging to children under the age of 13 while collecting their personal information without parental consent, a newly unsealed court document from an ongoing federal lawsuit against the social media giant alleges.

Attorneys general from 33 U.S. states say that between early 2019 and mid-2023, Meta received more than one million reports of users under the age of 13 on Instagram from parents, friends, and online community members. Even so, "Meta deactivated only a fraction of those accounts," the complaint states.

The federal lawsuit seeks injunctions barring Meta from practices that the attorneys general allege violate the law. Civil penalties could run into the hundreds of millions of dollars, since Meta allegedly hosts millions of teen and child users; most states seek fines of between $1,000 and $50,000 per violation.

In this photo illustration, the Instagram logo lights up on a smartphone on October 6, 2023 in Berlin, Germany. (Photo illustration by Thomas Trutschel/Photothek via Getty Images)

Privacy violations

According to the 54-count lawsuit, Meta violated a number of state consumer protection laws, as well as the Children's Online Privacy Protection Rule (COPPA), which prohibits companies from collecting personal information from children under the age of 13 without parental consent. Meta allegedly failed to comply with COPPA with respect to both Facebook and Instagram, even though "Meta's own records reveal that Instagram's audience composition includes millions of children under the age of 13" and "hundreds of thousands of teen users spend more than five hours a day on Instagram," the court document states.

A Meta product designer wrote in an internal email that "young people are the best," adding that "you want to bring people to your service young and early," according to the lawsuit.

"Instagram's terms of use prohibit users under the age of 13 (or older in certain countries) and we have measures in place to remove these accounts when we identify them. However, verifying people's age online is a complex challenge for the industry," Meta told CNN in a statement on Sunday. "Many people – particularly those under the age of 13 – do not have an identity card, for example. That's why Meta supports federal legislation that requires app stores to get parental approval whenever their children under the age of 16 download apps. With this approach, parents and teens won't have to provide hundreds of individual apps with sensitive information, such as ID, to verify their age."

Content harmful to mental health

The unsealed complaint also alleges that Meta knew its algorithm could direct children to harmful content, thereby harming their well-being. According to internal company communications cited in the document, employees wrote that they were concerned about "content on IG that triggers negative emotions among tweens and affects their mental well-being (and) our ranking algorithms that lead them into negative spirals and feedback loops that are difficult to break out of."

For example, Meta researchers conducted a study in July 2021 that concluded that Instagram's algorithm may be amplifying negative social comparison and "content with a tendency to make users feel worse about their body or appearance," according to the complaint. In internal February 2021 emails cited in the lawsuit, Meta employees purportedly acknowledged that social comparison was "associated with increased time spent" on Meta's social media platforms and discussed how this phenomenon is "valuable to Instagram's business model while simultaneously causing harm to teenage girls."

In an internal March 2021 investigation that analyzed content about eating disorders, the Meta team tracked users whose account names referenced starvation, thinness, and eating disorders. Instagram's algorithm then began generating a list of recommended accounts "that included accounts related to anorexia," the lawsuit states.

However, Antigone Davis, Meta's global head of safety, testified before Congress in September 2021 that Meta does not "direct people toward content that promotes eating disorders. In reality, that violates our policies and we remove that content when we become aware of it. We actually use AI to find content like that and remove it."

"We want teens to have safe and age-appropriate online experiences, and we have more than 30 tools to support them and their parents," Meta told CNN in a statement. "We've spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online. The complaint mischaracterizes our work using selective citations and carefully chosen documents."

Instagram's top brass also knew that problematic content was a critical issue for the platform, the lawsuit states. Adam Mosseri, the head of Instagram, allegedly wrote in an internal email that "social comparison is to Instagram [what] election interference is to Facebook." The lawsuit does not specify when that email was sent.

CNN reached out to Meta regarding Davis and Mosseri's comments and did not immediately hear back.

Yet despite internal research confirming concerns about social comparison on its platforms, the lawsuit alleges, Meta refused to change its algorithm. One employee noted in internal communications cited in the lawsuit that content that incites negative comparisons of appearance "is part of the most engaging content (on the Explore page), so this idea actively runs counter to many other teams' top-line measures." Meanwhile, "Meta's external communications denied or concealed the fact that its Recommendation Algorithms promote Highly Negative Appearance Comparison content among young users," the lawsuit states.

Meta was also aware that its recommendation algorithms "trigger intermittent releases of dopamine in young users" that can lead to addictive cycles of consumption on its platforms, according to internal documents cited in the lawsuit.

"Meta has profited from children's pain by intentionally designing its platforms with manipulative features that addict children to its platforms while lowering their self-esteem," New York Attorney General Letitia James said in a statement last month. New York is one of the states implicated in the federal lawsuit. "Social media companies, including Meta, have contributed to a national youth mental health crisis and need to be held accountable," James said.

Eight other attorneys general sued Meta last month in various state courts, making claims similar to those in the massive multistate federal lawsuit. Florida sued Meta in its own separate federal lawsuit, alleging that the company misled users about the potential health risks of its products.

The wave of lawsuits is the result of a bipartisan, multistate investigation dating back to 2021, after Facebook whistleblower Frances Haugen came forward with tens of thousands of internal company documents that she said showed how the company knew its products could have negative effects on young people's mental health.

CNN's Brian Fung contributed to this report.
