Recently, Snapchat settled a social media addiction lawsuit in Los Angeles, California. The lawsuit was brought by a 19-year-old who accused the company of designing algorithms and features that led to his addiction and the mental health problems that followed.
According to the New York Times, lawyers representing the teenager alleged that the social media platforms obscured information about potential harms to their users. They argued that features like infinite scroll, autoplay video, and algorithmic recommendations trick users into continuous use, leading to depression, eating disorders, and self-harm.
Now, Snap wasn’t the only platform sued in the addiction case; others, including Meta (Facebook and Instagram), TikTok, and even YouTube, were also named in the suit. However, only Snap appears to have caved, apparently because internal evidence dating back nine years showed that its own employees had raised concerns about the risk its algorithm posed to teenagers’ mental health.
The teenager’s lawyers drew parallels to Big Tobacco, referring to the 1990s lawsuits against cigarette companies that concealed health risks.
Here is the big question: are social media companies to blame for teenagers’ social media addiction?
Addiction and its effect on mental health are psychological problems.
And psychologists generally agree that no single entity is responsible for addiction; it is a product of individual, social, and psychological factors. So while individuals are exposed to the addictive substance, or in this case the platform, they are also heavily influenced by other factors.
These include peer pressure, poor quality of life, trauma, stress, depression and other mental health issues, early exposure to social media, and financial gain. The availability and acceptance of social platforms deepen addiction, as they have quickly become part of everyday life and culture across the world.
The question, then, is: if several factors are responsible for addiction, why are social media companies alone getting the flak for social media addiction? It is akin to holding breweries liable for alcohol addiction, or cigarette companies for smoking addiction.
Perhaps because this case revolves around teenagers who are still minors, one can understand why responsibility should not fall on them alone. Yet what happened to the other entities tasked with protecting minors: parental control, familial support, and governmental protection?
These are entities that can control, if not eliminate, exposure. Why single out the social media companies?
It is important to note that several countries are taking steps to limit access to social media for young people. In December 2025, Australia became the first country in the world to ban social media for children under the age of 16.
The banned platforms include TikTok, Alphabet’s YouTube, and Meta’s Instagram and Facebook. Platforms that fail to comply could face penalties of up to $33.3 million (49.5 million Australian dollars).
Malaysia followed with a ban on social media for minors, set to take effect in 2026. The government is developing codes that platforms like Facebook, Instagram, and X will have to follow. The restriction will prevent users under the age of 16 from creating social media accounts.
France passed a law requiring parental consent for children under 15, though reports suggest it is poorly implemented due to technical challenges. The case is different in Germany, where minors between 13 and 16 need parental consent to use social media. While the regulation is in full force, advocates say the controls are inadequate.
The UK is considering an Australia-style ban for minors. Indeed, the ban might go further, as some argue that a cutoff age of 16 is too low to be impactful.
In a nutshell, countries are taking steps to protect young people from early exposure through limited access and tighter controls. This sounds like the most responsible thing to do. Nonetheless, it does not absolve the social media companies of responsibility.
To be fair, social media companies are also taking some measures.
TikTok, for instance, introduced tools that let users control their experience, manage exposure to certain content types, filter specific words, and entirely avoid content that may be detrimental to their mental health.
[Image: TikTok Digital Well-being Ambassadors for SSA]
TikTok also introduced family pairing tools that let parents manage their kids’ exposure on the platform and plan sleep time, and that let users control who can watch and comment on their videos.
These are available for young users aged 13 to 15. Other platforms have similar offerings; YouTube, for example, runs a separate service for children, YouTube Kids, which gives parents full control over their children’s experience.
Yet it seems the plaintiffs’ lawyers are keen on looking beyond these measures. They instead single out core features like infinite scroll, automatic video play, algorithmic recommendations, and push notifications as the culprits and demand their removal.
For their part, the social media companies argue that features like algorithmic recommendations, push notifications, and infinite scroll are akin to a newspaper deciding which stories to publish, and are therefore protected speech under the First Amendment.
Given that no platform has ever lost a social media addiction lawsuit, the companies have every reason to be optimistic. A loss, however, would mean billions of dollars doled out in penalties. Whether that will be the outcome remains to be seen.