An Italian parents’ group and several families have brought a lawsuit against Meta and TikTok, with the first court hearing held in Milan on Thursday. The legal action seeks stricter controls on minors’ access to social media platforms, specifically targeting Facebook, Instagram, and TikTok.
Legal Challenge Against Social Media Giants
The class injunctive action, brought by MOIGE (an Italian parents’ movement) and a group of families, is being heard before Milan’s business court. At the core of the lawsuit is a demand that the platforms implement more robust age-verification systems for users under the age of 14.
Furthermore, the plaintiffs are calling for the removal of algorithms perceived as manipulative and for greater transparency regarding the potential negative impacts of excessive social media use on young people.
MOIGE estimates that approximately 3.5 million Italian children between the ages of 7 and 14 are illegally active on these platforms, a situation the group aims to rectify.
Platform Responses and Defense
TikTok noted that the litigation is ongoing and emphasized its commitment to rigorously enforcing its Community Guidelines. The platform claims to proactively remove over 99% of content that violates these rules, particularly rules concerning mental and behavioral health.
A spokesperson for TikTok added, “We also continue to invest in safety measures to diversify recommended content, block potentially harmful searches and connect vulnerable users with available support resources.”
Meta, on the other hand, expressed strong disagreement with MOIGE’s allegations. The company highlighted its ongoing efforts to enhance online safety for teenagers, citing features like Teen Accounts and their associated safeguards.
“We know parents worry about the safety of their teens online, which is why we’re consistently making changes to help protect teens,” Meta stated. “We stand by our record and will continue to do more to keep young people safe.”
Jurisdictional Disputes and Broader EU Context
During the initial hearing, lawyers representing Meta and TikTok reportedly raised preliminary objections. These objections disputed the competence and jurisdiction of Italian courts to preside over the case.
The defense also challenged new documents presented by MOIGE’s legal team. These documents, according to MOIGE, demonstrate the companies’ awareness of the potential adverse effects of their algorithms on minors, including design elements intended to boost user engagement.
MOIGE’s lawyers countered these objections, arguing that Italian courts possess full jurisdiction. They characterized the issue as a matter of public health and urged the court to expedite the proceedings due to the alleged risks faced by children.
The court is expected to establish a schedule for subsequent hearings at a later date.
This legal battle unfolds as the European Union signals its intent to address harmful social media practices. European Commission President Ursula von der Leyen recently announced that the EU executive plans to target addictive and harmful design features employed by social media firms through its forthcoming Digital Fairness Act.
Similar regulatory movements are under way globally, with countries such as Australia, France, and Greece considering or implementing restrictions on minors’ social media use. Spain, for instance, announced plans in February to ban social media use for teenagers.
Future Outlook and Implications
The outcome of the Milan lawsuit could set a significant precedent for how social media platforms are regulated concerning minors across Europe. The focus on age verification, algorithmic transparency, and protection against addictive designs highlights a growing global concern over the impact of digital platforms on young, vulnerable users.
As courts deliberate and legislative bodies propose new regulations, parents, industry stakeholders, and policymakers will be closely watching developments. The case underscores the ongoing tension between technological innovation, corporate responsibility, and the imperative to safeguard children in an increasingly digital world. Further hearings will likely delve deeper into the technical aspects of platform design and the extent of corporate knowledge regarding potential harms.