Snapchat and TikTok Settle California Social Media Addiction Lawsuit as Claims Against Meta and YouTube Continue

by Alexandra Agraz | Jan 28, 2026

Two major social media companies have reached settlements in a California lawsuit alleging that popular platforms contributed to youth mental health harm through features designed to keep users engaged for extended periods. The settlements narrow the case as it continues against the remaining companies.

Snap Inc., the parent company of Snapchat, and ByteDance-owned TikTok have each agreed to resolve claims brought by a 19-year-old California woman identified in court filings as K.G.M. The agreements come as claims against other platforms continue in Los Angeles County Superior Court.

TikTok reached an agreement in principle to settle the case on Tuesday, according to a lawyer involved in the matter. The terms were not disclosed. Snapchat previously settled with K.G.M. on January 20, also without releasing financial details. Both companies have denied wrongdoing in connection with the allegations.

The case centers on claims that social media platforms use design features that encourage prolonged and repetitive use by young users. Court filings cite tools such as notifications, algorithm-driven recommendations, and continuous scrolling as features intended to keep users engaged without sufficient protections for minors.

K.G.M. alleges she began using multiple social media platforms at a young age and later developed serious mental health problems. According to the filings, she links her depression and suicidal thoughts to extended use of the apps and seeks to hold the companies that designed them legally responsible.

The lawsuit is one of several test cases selected from hundreds of similar claims filed nationwide by individuals, parents, and school districts. These test cases, known as bellwether trials, are used in complex litigation to examine shared legal and factual questions before courts address the broader group of claims.

A key ruling last year allowed the lawsuit to proceed. A Los Angeles judge determined that the claims could move forward because they focus on how the platforms are built and operated rather than on content created by users. That distinction is central to whether the companies can face legal responsibility at all.

Social media companies have long relied on Section 230 of the Communications Decency Act, a federal law that generally shields online platforms from liability for content posted by users. The statute typically applies when alleged harm is tied to what users say or share online. It does not automatically protect companies from claims that their own products were designed in a way that causes harm.

By allowing the lawsuit to continue, the court recognized that claims centered on platform design and engagement tools may fall outside the scope of that legal protection. The decision reflects a legal focus on how digital products function, rather than only on the speech they host.

The claims rely on legal theories that include negligence, product liability, and failure to warn. The lawsuit argues that social media companies had a responsibility to design reasonably safe products, particularly when those products are widely used by children and teenagers. It also alleges the companies failed to adequately disclose known risks associated with prolonged or compulsive use.

Under this approach, social media platforms are treated like other consumer products. The filings argue that when a product’s design encourages harmful use patterns without appropriate safeguards or warnings, the companies responsible for that design can be held accountable for resulting injuries.

Lawyers involved in the case have compared the lawsuit to earlier public health cases involving tobacco and opioid manufacturers, where courts allowed claims to proceed based on allegations tied to product design and corporate decision-making. While the social media lawsuits involve different facts and legal standards, the comparisons raise broader questions about corporate responsibility for public health harms linked to widely used products.

With the settlements by Snapchat and TikTok, the case now moves forward against Meta Platforms, which owns Instagram and Facebook, and Google, which owns YouTube. Those companies have not settled and have denied the allegations, arguing that users make personal choices about how they use social media, that parents play a central role in managing children’s online activity, and that federal law limits their liability.


Alexandra Agraz
Alexandra Agraz is a former Diplomatic Aide with firsthand experience in facilitating high-level international events, including the signing of critical economic and political agreements between the United States and Mexico. She holds dual associate degrees in Humanities, Social and Political Sciences, and Film, blending a diverse academic background in diplomacy, culture, and storytelling. This unique combination enables her to provide nuanced perspectives on global relations and cultural narratives.