A jury in Los Angeles has found Meta and Google liable in a lawsuit filed by a woman identified in court as Kaley. She claims she developed an addiction to Instagram and YouTube after using the platforms since childhood. The case is one of the first to hold social media companies liable for how their platforms are designed, rather than for user-generated content.
Jurors concluded that both companies acted negligently and failed to warn users about the risks of prolonged platform use. They also determined that certain design features, such as recommendation systems, push notifications, and autoplay, played a role in the plaintiff’s reported mental health issues.
Damages awarded in the Meta and Google social media addiction case
The jury awarded three million dollars in compensatory damages and supported an additional award of punitive damages. Liability was apportioned between the two companies, with Meta bearing the larger share. Final punitive damage amounts have not been confirmed in the source material.
In a separate case, a jury in New Mexico ordered Meta to pay 375 million dollars after finding violations related to child safety protections. Mark Zuckerberg testified during the Los Angeles trial and internal company documents were presented as evidence.
Legal arguments of both parties in the Los Angeles trial
The plaintiff’s legal team argued that the platforms were designed to promote compulsive use, making it difficult for younger users to disconnect. Kaley shared her experience of developing body dysmorphia, depression, and suicidal thoughts after years of near-constant use of both platforms.
Lawyers representing Meta and Google responded that the plaintiff’s mental health issues stemmed from her personal circumstances rather than her use of the platforms. They also questioned whether social media addiction is an officially recognized medical condition.
Broader legal context and what the verdict could mean next
The case is part of a broader legal strategy focused on how platforms are designed, rather than on what users post. This approach appears to sidestep Section 230 of the Communications Decency Act, which generally shields platforms from liability for third-party content.
Hundreds of similar cases are pending across the United States, brought by parents, school districts, and state officials. Some earlier cases involving TikTok and other platforms were settled before this trial concluded.
The verdict in Los Angeles could influence how courts in other ongoing cases view the connection between platform design choices and user harm. Meta and Google have not yet stated whether they plan to appeal the ruling.