The attorneys general say they will look into whether, by continuing to provide and promote Instagram despite knowing of the potential harms, Meta violated consumer protection laws and “put the public at risk.” The states involved include California, Florida, Kentucky and Vermont.
“Facebook, now Meta, has failed to protect young people on its platforms and instead chose to ignore or, in some cases, double down on known manipulations that pose a real threat to physical and mental health — exploiting children in the interest of profit,” Massachusetts Attorney General Maura Healey, who is co-leading the investigation, said in a statement. She added that the coalition hopes to “get to the bottom of this company’s engagement with young users, identify any unlawful practices, and end these abuses for good.”
Meta (FB) spokesperson Andy Stone said in a statement that the allegations made by the attorneys general are false and said they “demonstrate a deep misunderstanding of the facts.” He also noted that the company plans to launch features to help teens regulate their use of Instagram, such as a “Take a Break” reminder, which was announced in October amid intense scrutiny.
“While challenges in protecting young people online impact the entire industry, we’ve led the industry in combating bullying and supporting people struggling with suicidal thoughts, self-injury, and eating disorders,” the statement reads. “We continue to develop parental supervision controls and are exploring ways to provide even more age-appropriate experiences for teens by default.”
The investigation is the latest escalation of regulatory pressure on Meta related to findings from the leaked internal documents, which have come to be known as the Facebook Papers. Earlier this week, Ohio Attorney General Dave Yost filed a lawsuit against the company alleging that it misled the public about its algorithm and the harms its apps can cause to users, causing losses for shareholders when those issues came to light. (Meta says the suit is without merit.)
Senator Richard Blumenthal has also asked Meta CEO Mark Zuckerberg to testify about the effects of Instagram on children.
The Wall Street Journal first reported in September on what the company’s own documents and research show about the potential harms to young people from its apps, and said Facebook knew Instagram was “toxic” for teen girls. In one internal report from 2019 on the mental health effects of Instagram cited by the Journal and reviewed by CNN Business, company researchers said “we make body image issues worse for 1 in 3 teen girls.” Meta has also pushed back on the Journal’s reporting, and said its apps do more good than harm.
Following the Journal’s report, a Senate subcommittee called a hearing with Facebook head of global safety Antigone Davis, where lawmakers grilled her on Instagram’s effects on kids. Davis said the company was “looking for ways to release more research” that she suggested might paint a different picture of the platform. Frances Haugen, the former Facebook employee who leaked the documents, has also testified to lawmakers that she believes Meta’s platforms “harm children, stoke division, and weaken our democracy.”
In late September, amid the fallout from the Journal’s report, the company announced it was pausing plans to develop a version of Instagram designed for kids.
“While we stand by the need to develop this experience, we’ve decided to pause this project,” Adam Mosseri, head of Instagram, wrote in a blog post at the time. “This will give us time to work with parents, experts, policymakers and regulators, to listen to their concerns, and to demonstrate the value and importance of this project for younger teens online today.”