Meta Faces Allegations of Knowingly Allowing Underage Users and Collecting Their Data Without Consent

A recently unsealed court document in an ongoing federal lawsuit alleges that, since at least 2019, Meta knowingly allowed the majority of accounts belonging to children under 13 to remain active while collecting their personal information without parental consent.

The lawsuit, filed by attorneys general from 33 states, alleges that Meta received over a million reports of under-13 users on Instagram between early 2019 and mid-2023, yet disabled only a fraction of those accounts. The attorneys general are seeking court orders barring Meta from the allegedly unlawful practices, along with civil penalties that could reach hundreds of millions of dollars given Meta’s extensive base of teen and child users.

The 54-count lawsuit accuses Meta of violating state consumer protection statutes and the Children’s Online Privacy Protection Act (COPPA), which prohibits collecting personal information from children under 13 without parental consent. The complaint argues that Meta failed to comply with COPPA on both Facebook and Instagram, even though the company’s own records suggest the presence of millions of children under 13 on Instagram.


The complaint also highlights an internal email from a Meta product designer expressing the sentiment that “young ones are the best ones.”

Meta responded to the allegations by stating that online age verification is a challenging problem, especially for users under 13. The company emphasized its support for federal legislation requiring parental approval for app downloads by teens under 16, aiming to simplify age verification without compromising sensitive information.

The lawsuit further alleges that Meta knew its algorithm could steer children toward harmful content, adversely affecting their well-being. Internal communications showed employee concerns that Instagram’s algorithm was contributing to negative emotions among tweens, and a July 2021 study suggested the algorithm might amplify negative social comparison and content related to body image.

Despite Meta’s claims that it does not promote content that encourages eating disorders, the lawsuit cites an internal investigation in March 2021 that found Instagram’s algorithm generating recommended accounts related to anorexia based on users’ references to starvation and disordered eating.

While Meta disputes the allegations as a mischaracterization of its efforts, the lawsuit points to internal communications suggesting the company was aware of social-comparison problems on its platforms, and argues that Meta refused to alter its algorithm despite acknowledging that certain content caused negative appearance comparisons.

Additionally, the lawsuit asserts that Meta knew its recommendation algorithms trigger dopamine releases in young users, potentially fostering addictive behavior on its platforms.

New York Attorney General Letitia James said that Meta intentionally designed its platforms with manipulative features, profited from children’s pain, and contributed to a national youth mental health crisis, emphasizing the need to hold the company accountable.

This lawsuit is part of a broader wave of legal actions resulting from a bipartisan, multistate investigation initiated in 2021 after Facebook whistleblower Frances Haugen disclosed internal documents suggesting the company’s awareness of its products’ adverse effects on young people’s mental health.
