Mark Zuckerberg is set to testify on Wednesday in a high-profile Los Angeles trial that has been described by experts as the social media industry’s “Big Tobacco” moment.
The case, which began in late January 2026 at Los Angeles Superior Court, centers on a young woman who alleges she became addicted to social media and video streaming platforms, including Instagram and YouTube.
Lawyers for the plaintiff claim that Meta Platforms designed its products to encourage compulsive use and failed to adequately warn young users of the risks. Instagram chief Adam Mosseri testified that while he acknowledges problematic usage can occur, he does not equate heavy social media use with clinical addiction. “So it’s a personal thing, but yeah, I do think it’s possible to use Instagram more than you feel good about. Too much is relative, it’s personal,” Mosseri said.
The Los Angeles trial is one of several major court cases this year scrutinizing the social media industry, highlighting potential harm caused by platforms widely used by young people. Experts have compared the proceedings to the “Big Tobacco” lawsuits, in which tobacco companies were found liable for concealing the risks of smoking despite evidence of health harms.

In addition to the LA trial, Meta is facing legal challenges in other U.S. jurisdictions. In New Mexico, the state’s attorney general, Raúl Torrez, has filed a lawsuit alleging that Meta failed to adequately protect children and young users from online predators. These cases collectively reflect growing scrutiny over the ethical responsibilities of social media companies regarding user safety, particularly among minors.
Mark Zuckerberg’s testimony in Los Angeles is expected to address Meta’s design and safety policies, as well as the company’s awareness of research on social media’s impact on mental health. Observers say the trial could have far-reaching implications for platform design, regulation, and corporate accountability, potentially influencing how social media platforms approach user protection worldwide.
The case underscores the tension between engagement-driven, potentially addictive design features and corporate responsibility in a rapidly evolving digital environment. As more young users engage with social media daily, courts and regulators are increasingly asking whether tech companies are doing enough to prevent harm, especially when their platforms are deeply integrated into everyday life.

Legal analysts note that the outcome of the Los Angeles trial may set precedents for liability, disclosure, and industry-wide safety standards for social media. With Mark Zuckerberg set to take the stand, attention on the case is expected to intensify.
The Los Angeles trial comes amid increasing scrutiny of the social media industry over its impact on mental health, particularly for adolescents and young adults. In recent years, research from universities and public health organizations has raised concerns about the link between excessive social media usage and anxiety, depression, and other psychological effects. Reports suggest that design features such as algorithmic feeds, notifications, and “likes” systems can encourage extended engagement, which critics argue may contribute to compulsive use and behavioral dependency.
As with the landmark U.S. tobacco lawsuits, in which companies were held accountable for deliberately obscuring the health risks of their products, social media platforms now face allegations that they knew about potential harms yet failed to adequately warn users or mitigate risks, prioritizing engagement and advertising revenue instead.
Beyond Los Angeles, Meta faces other significant legal challenges. In New Mexico, state authorities have accused Meta of failing to protect children from online predators, arguing that the company’s platforms allowed unsafe interactions despite internal research and safeguards. Internationally, Meta, TikTok, and other social media firms are under investigation in Europe, the UK, and Australia over children’s privacy, content moderation, and digital wellbeing. These cases form part of a broader trend of regulatory attention focusing on youth protection, algorithmic accountability, and data privacy.
Instagram chief Adam Mosseri’s testimony last week highlighted internal debates within Meta about whether heavy social media usage constitutes clinical addiction. While Mosseri acknowledged that some users may overuse the platform, he emphasized that the issue is “personal” and varies by individual, a stance likely to be scrutinized in the trial as jurors consider the company’s responsibility to prevent harm.
Meta has defended its approach, citing research, educational tools, and parental controls designed to help users manage their time online. However, consumer advocates argue that such measures are insufficient, pointing to internal company documents that suggest Meta’s awareness of how its algorithms can exploit psychological vulnerabilities, especially among minors.
The LA trial may have far-reaching implications for the social media industry. If jurors find Meta liable, it could open the door to new lawsuits, stricter regulatory frameworks, and mandatory changes to platform design and content policies. Policymakers in the United States and abroad are already exploring legislation that would require greater transparency in algorithmic systems, stronger age verification, and limits on addictive engagement features.
For investors and industry observers, the proceedings signal a turning point in public expectations and corporate accountability. Companies like Meta are balancing user growth with increasing legal and reputational risk, while global regulators seek to hold platforms to higher safety standards. The trial is being closely watched as an early test case that could influence corporate behavior, legal precedents, and public policy surrounding social media safety for years to come.