New Mexico: Meta Platforms, the parent company of Facebook, Instagram, and WhatsApp, is set to face a historic civil trial in New Mexico, accused of exposing children and teenagers to sexual exploitation while allegedly prioritizing engagement over user safety. The trial, commencing next week in Santa Fe District Court, marks the first time claims of this nature against the social media giant have reached the jury phase.
The lawsuit, filed in 2023 by New Mexico Attorney General Raúl Torrez, asserts that Meta’s platforms systematically gave adults access to minors, in some cases facilitating real-world abuse and even human trafficking. The complaint claims the company’s internal policies and design choices created an environment that left children vulnerable to predators. Prosecutors argue that features like endless scrolling, autoplay videos, and algorithmic recommendations not only keep young users engaged but also expose them to significant risks.
Central to the state’s case are undercover operations in which investigators created accounts pretending to be underage users. According to court filings, these decoy accounts received sexually explicit material and were contacted by adults seeking illicit interactions, some of whom faced subsequent criminal charges. The lawsuit portrays these incidents as evidence of Meta’s alleged failure to implement even basic safeguards to protect minors.
The state also plans to highlight internal Meta documents indicating that company executives were aware of these dangers yet failed to act effectively. Prosecutors claim the company misrepresented its safety protocols for children and did not enforce adequate age verification, enabling exploitation to persist. Of particular concern is evidence surrounding Meta’s artificial intelligence chatbots, which, under past internal policy, could engage in romantic or sensual dialogue with minors, a practice the company says has since been discontinued.
Meta has strongly denied the allegations, emphasizing the safety mechanisms it has built, including content moderation, parental controls, and cooperation with law enforcement to combat exploitation. In legal filings, the company has invoked the First Amendment and Section 230 of the U.S. Communications Decency Act, protections that shield online platforms from liability for user-generated content. Meta also notes that it disabled millions of accounts in 2023 for violating its child safety policies.
The New Mexico case is part of a larger wave of lawsuits across the United States targeting social media companies over the impact of their platforms on minors. Experts and observers suggest the outcome could set a precedent for how courts evaluate the responsibility of digital platforms to protect vulnerable users, particularly in an era dominated by engagement-driven algorithms.
The trial is expected to last seven to eight weeks, with prosecutors presenting internal documents, expert testimony, and evidence collected over more than two years. The case has drawn national attention, raising broader questions about the balance between free expression, platform accountability, and child safety in the digital age.