In March of this year, a court ordered the well-known tech companies Meta and Google to pay $6 million in personal injury damages. Following that landmark verdict, another consequential case is set to be heard in a California state court. The court will again examine whether tech giants should be held liable for harms caused by the addictive features of their platforms, returning to some now-familiar legal arguments.
Like the 20-year-old Californian woman referred to as “Kaley GM” or “KGM” in court documents, who sued Instagram, YouTube, TikTok, Snapchat, and their parent companies, the second plaintiff, “RKC,” claims to have developed anxiety disorders, depression, and body dysmorphic disorder after becoming addicted to these platforms. He is suing the same four companies.
Both lawsuits accuse these social media companies of designing products with the intention to hook vulnerable young people and trap them in a vicious cycle of overusing the products, despite being aware of the harm these products can cause.
The second plaintiff’s lawyer pointed out that RKC is a 17-year-old African American boy from Panama City, Florida, whose perspective differs from that of Kaley, a young white woman, though there are similarities between their experiences. Josh Autry, a lawyer from Morgan & Morgan representing RKC, told The Epoch Times, “His experience is similar but different. He’s a kid, a teenager, still struggling with social media addiction.”
Autry added, “He is not from California. He is from Florida, on the other side of the country. He rarely travels, and I think this trial might be his first time in California. He is a member of a minority group. I think all of that is important.”
In this pivotal process, a handful of test cases will serve as a blueprint for the arguments and outcomes of thousands of others. It will be crucial for juries to see both male and female plaintiffs, young adults and teenagers who grew up in the social media era, from a variety of backgrounds.
While much research and attention have focused on social media’s negative effects on girls, the perspective of a male teenager adds a new dimension to the ongoing discussions. Some observers have called this a “Big Tobacco moment” for major tech companies: a high-stakes contest over the industry’s future, unfolding in groundbreaking court trials and in the court of public opinion.
In his 2023 lawsuit, RKC claimed that due to the “defendant platforms’ addictive designs” and “persistent notifications,” he became addicted to the platforms and almost entirely abandoned other activities. Subsequently, he experienced sleep deprivation, depression, eating disorders, and suicidal tendencies.
The lawsuit alleges that the defendants failed to warn RKC and his father of the dangers of compulsive use and made false statements about the “safety, utility, and non-addictiveness” of their products.
The focus of these civil lawsuits primarily revolves around the design and operation of the defendant platforms, including features like infinite scrolling, beauty filters, and AI-driven proprietary algorithms, rather than the user-generated content the platforms may host. Legally, the First Amendment to the United States Constitution and Section 230 of the Communications Decency Act of 1996 protect platforms from liability for third-party content.
These cases are just a fraction of the thousands of civil lawsuits, including suits brought by school districts and attorneys general nationwide, that have taken years to reach the trial stage. Just a day before the verdict in Kaley’s case, a jury in New Mexico found Meta liable for misleading consumers, falsely claiming its products were safe, and harming child safety, and ordered the company to pay a $375 million civil penalty.
Executives from Meta and Google, including Meta CEO Mark Zuckerberg, testified during Kaley’s trial, stating that their products are not designed to be addictive, denying that clinically significant social media addiction exists, and claiming the companies have erred on the side of caution in their approach to safety. Google is the parent company of YouTube.
During Kaley’s trial, expert witnesses described the explosive growth of social media addiction among young people, with effects on the brain akin to those of substance abuse. Other experts, however, noted the lack of consensus among institutions regarding diagnosis, treatment, and the true extent of the phenomenon. Meta’s lawyers aggressively delved into Kaley’s sensitive medical records and troubled family history, arguing that her mental health issues were more likely caused by genetic factors or parental abuse or neglect than by compulsive social media use.
The jury seemed unconvinced by this argument, finding Meta responsible for 70% of the compensatory and punitive damages awarded to Kaley and YouTube for the remaining 30%. YouTube denied any negligence and argued that its platform is not a social media application but a video streaming service akin to Netflix. Both companies released statements expressing disagreement with the verdict and said they plan to appeal.
Autry said he does not expect the defendants’ legal strategy to change in future cases. “I think this will be their primary defense in every trial. They will say these kids would have terrible lives anyway, with anxiety, depression, and suicidal tendencies.”
For RKC, the debate over parental control of teenage social media use will be a live issue. “I think in this case, it’s crucial to get a landmark ruling on the question of blaming parents for giving a child a phone instead of confiscating it or taking more restrictive measures… He is still under his father’s supervision. Having his father testify, talking about his own struggles with social media and how he, as a father, struggles to handle his son’s social media use… all of this is very important.”
Autry mentioned that, like Kaley, RKC has also struggled with weight issues, related self-esteem troubles, and bullying; and like her, even though these applications made him feel worse, he kept coming back to them. “We use this to prove how severe their addiction is – they can’t stop themselves from going back to these platforms even if they don’t like it.”
However, unlike in Kaley’s case, where Instagram’s beauty filters were central, beauty filters played a relatively minor role for RKC. “For him, those filters were either dog ears or the Superman emblem on his chest. Apart from possibly delivering a little extra dopamine hit, the filters themselves weren’t the cause of his harm. These extra features kept him coming back to the platform, but unlike in Kaley’s situation, the filters didn’t make him hate himself more,” Autry explained.
Similar to the first trial, the focus of the second trial will be on expert testimony regarding addiction science – including research on how improper use affects adolescent brain development – and internal documents revealing when the defendants became aware of and to what extent they understood this issue.
Thousands of unsealed internal company documents will allow the jury and the public to see what the leadership of Meta and Google knew about the potential harm to their underage users.
Representatives of both companies argued that the plaintiffs misconstrued these documents to serve incomplete and inaccurate narratives.
Shortly before Kaley’s trial, Snapchat and TikTok settled, but these two companies are still embroiled in thousands of related cases.
Autry said that more internal documents in the upcoming trials will show that the defendants knowingly designed products that would hook children. “Meta and YouTube’s documents are not isolated incidents. We haven’t gone to trial with Snap or TikTok yet, but these documents are just as crucial. TikTok and Snap, just like Meta and YouTube, designed their platforms to maximize user engagement time and induce addiction.”
Autry specifically pointed out that TikTok’s internal documents will show the company was well aware of “safer operational methods,” referring to an application version with additional safety mechanisms launched in China. “They made business decisions that prioritized growth over safety, while in China they put more emphasis on protecting children.”
Representatives from Snap, Meta, ByteDance, and Google did not respond to inquiries from The Epoch Times.
In 2023, Superior Court Judge Carolyn Kuhl ruled that California does not allow “strict liability” claims, meaning Kaley’s lawyers had to prove that the companies knew their products were defective and were negligent.
The plaintiff’s lawyers in the upcoming trial said that Florida does allow strict liability: they still need to prove the platforms’ designs are defective and dangerous, but they do not have to prove the companies were aware of those defects.
The plaintiff’s lawyers asked Judge Kuhl to schedule the trial for the earliest available date this summer, though some defendants oppose a summer trial.
Kuhl is expected to rule on both issues later this month.
