Social Media Firms Face Legal Scrutiny Over Youth Harm Claims
Meta Platforms, TikTok, and Alphabet’s Google and YouTube faced intense courtroom scrutiny this week amid mounting allegations that their platforms contribute to a growing youth mental health crisis. The broader national debate over children’s screen time has consequently entered a more serious and legally significant phase.
Jury Findings in Los Angeles Case
A Los Angeles jury found on March 25 that Meta and Google must pay a combined $6 million in damages to plaintiff Kaley G.M., a 20-year-old who argued that she developed depression and experienced suicidal thoughts after becoming addicted to the companies’ platforms at a young age. The jury determined that both companies acted negligently in designing features that encouraged prolonged use and failed to adequately warn users about the risks associated with their platforms.
New Mexico Verdict Against Meta
In a separate case, brought by the state’s attorney general over serious concerns about user protection, a New Mexico jury on March 24 ordered Meta to pay $375 million in damages. The jury determined that the company misled users about the safety of Facebook and Instagram and found that the platforms enabled child sexual exploitation.
Broader Legal Implications
These trials mark a critical turning point: for the first time, courts are examining whether major technology companies can be held legally responsible for the design of applications linked to harm among young users. Meta, Snapchat, Google’s YouTube, and TikTok, along with TikTok’s parent company ByteDance, currently face thousands of lawsuits alleging that the companies knowingly implemented addictive features targeting children and teenagers, thereby worsening mental health outcomes.
Growing Legal and Global Backlash
Beyond individual cases, more than 2,300 lawsuits have been filed in federal courts by parents, school districts, and state authorities, collectively underscoring mounting dissatisfaction with how social media platforms operate. Company representatives insist they have implemented safety measures, but critics argue these efforts remain insufficient.
Meanwhile, the backlash extends beyond the United States. Australia has introduced a ban preventing users under 16 from accessing social media platforms, though early data indicates that enforcement remains challenging: within two months of the ban, more than 20 percent of teenagers under 16 were still using platforms such as TikTok and Snapchat.
Although usage among younger teens declined after the ban took effect, a significant proportion remained active. Notably, fears that teenagers would shift to unregulated platforms have not materialised; instead, some platforms, such as WhatsApp, have seen a slight increase in usage among this age group.
Ongoing Concerns Over Age Restrictions
Social media companies maintain that users must be at least 13 years old to register, but child protection advocates argue these safeguards are inadequate. Supporting this concern, official data from several European countries shows that many children under 13 still maintain active accounts.
With inputs from Reuters