New Mexico seeks tighter child safety rules on Meta apps
New Mexico prosecutors are calling for major changes to Meta’s social media platforms, including Facebook and Instagram, to better protect children, as the second phase of a landmark court trial begins.
Opening arguments are set for Monday in a three-week bench trial that will decide whether Meta’s platforms create a public nuisance under state law by harming users, especially children.
In the first phase of the case, a jury ordered Meta to pay $375 million in civil penalties. It found that the company knowingly harmed children’s mental health and hid information about child sexual exploitation on its platforms.
Now, prosecutors are asking the court to force Meta to redesign key features of its apps. Their proposals include limiting addictive design tools, improving age verification systems, strengthening privacy settings for minors, and increasing safeguards against child exploitation.
Prosecutors also want changes to how content is recommended, arguing that current algorithms push users toward endless engagement, and they are targeting features such as infinite scrolling, push notifications, and visible “like” counts as drivers of compulsive use.
Another proposal would require child accounts to be linked to a parent or guardian and would introduce a court-supervised monitor to track Meta’s compliance with safety improvements over time.
Meta has said it will appeal the jury’s verdict and warned it may suspend Facebook and Instagram services in New Mexico if forced to follow what it calls unrealistic requirements.
Legal experts say the case is unusual in how it challenges long-standing protections for internet companies. Eric Goldman of Santa Clara University said the legal theory of “public nuisance” is rarely applied to online platforms and may not fit well in this context.
New Mexico Attorney General Raúl Torrez said the verdict in the first phase weakened the protection tech companies have long relied on under Section 230 of the U.S. Communications Decency Act, which limits liability for user-generated content.
The case comes amid growing scrutiny of Big Tech; a jury in Los Angeles also recently found Meta and YouTube responsible for harms to children.
Prosecutors argue that the remedies they are seeking could force a broader rethink of how social media companies operate, not just Meta. They say the goal is to address what they describe as a youth mental health crisis linked to platform design.
Meta, however, says it already invests heavily in child safety and claims many of the proposed measures are unnecessary or unworkable. The company also argues it is being unfairly singled out while many other apps used by teenagers face less regulation.
It has invoked free speech protections, saying the proposed rules could restrict expression and interfere with parental authority.
“The state’s proposed mandates infringe on parental rights and stifle free expression for all New Mexicans,” Meta said in a statement.
The trial is the first to reach court among more than 40 similar lawsuits filed by U.S. state attorneys general against Meta over youth mental health concerns. Many of those cases are being handled in federal courts.
Experts say the outcome could have wider implications for how social media companies are regulated in the future, especially if courts approve stricter requirements such as mandatory age verification.
The first phase of the trial included six weeks of testimony from educators, mental health experts, investigators, Meta executives and former employees.