Newly unsealed federal court documents accuse Meta of knowingly keeping the accounts of children under 13 open while harvesting their data without parental consent. According to CNN, attorneys general in 33 states allege that Meta received more than a million reports of accounts held by users under the age of 13, yet, per the complaint, “Meta disabled only a fraction of those accounts.”
According to CNN, the civil penalties sought by the states could run into the millions of dollars, since Meta allegedly has millions of teenage and child users and the states are seeking anywhere from $1,000 to $50,000 per violation.
Meta faces 54 counts in federal court, including allegations that it violated the Children’s Online Privacy Protection Rule (COPPA), which bars companies from collecting personal information from children without parental consent. The complaint alleges that Meta violated COPPA on both its Facebook and Instagram platforms; according to the complaint, “Meta’s own records reveal that Instagram’s audience composition includes millions of children under the age of 13” and “hundreds of thousands of teen users spend more than five hours a day on Instagram.”
In a statement to CNN, Meta responded: “Instagram’s Terms of Use prohibit users under the age of 13 (or higher in certain countries), and we have measures in place to remove these accounts when we identify them. However, verifying the age of people online is a complex industry challenge.”
“Many people – particularly those under the age of 13 – don’t have an ID, for example,” the statement continued. “That’s why Meta is supporting federal legislation that requires app stores to get parents’ approval whenever their teens under 16 download apps. With this approach, parents and teens won’t need to provide hundreds of individual apps with sensitive information like government IDs in order to verify their age.”
However, according to internal documents cited in the lawsuit, Meta employees knew the company’s platforms could have harmful effects on children. They were allegedly concerned about “content on IG triggering negative emotions among tweens and impacting their mental well-being (and) our ranking algorithms taking [them] into negative spirals & feedback loops that are hard to exit from.”
In addition, Meta’s own research allegedly indicated that Instagram’s algorithm in particular amplified “content with a tendency to cause users to feel worse about their body or appearance.” Meta’s researchers ultimately concluded that this content was “valuable to Instagram’s business model while simultaneously causing harm to teen girls,” the outlet reports.
According to the lawsuit, Meta’s leadership knew its content could be a critical problem. Adam Mosseri, the head of Instagram, allegedly wrote in an internal document, “Social comparison is to Instagram [what] election interference is to Facebook.” Meta has not yet responded to the statements attributed to Mosseri in the internal document.
New York Attorney General Letitia James, whose state is among those suing Meta, told CNN in October, “Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem.” James continued, “Social media companies, including Meta, have contributed to a national youth mental health crisis and they must be held accountable.”