FEATURE | SOCIAL MEDIA & THE LAW
On a wet Tuesday morning in Los Angeles, Mark Zuckerberg — the 41-year-old CEO of Meta and perhaps the most consequential architect of the modern internet — sat before a jury for the first time in his career and faced a question that parents, psychiatrists, and regulators around the world have been asking for years: did you build these platforms to addict children?
The trial, the first of its kind to reach a full jury proceeding in the United States, concerns one California woman’s allegations that Instagram and YouTube deliberately engineered their recommendation algorithms and interface features to foster compulsive use among minors. Its legal outcome will be determined in Los Angeles. But its implications reverberate far beyond California — and nowhere more keenly than in Singapore, where the government has spent the past two years constructing its own framework to rein in the harms of social media on youth.
“The apps are made to keep you hooked” — words spoken not by a plaintiff’s attorney, but by a teenager in a 2025 NTU Singapore study on social media’s impact on youth.
What is unfolding in that Los Angeles courtroom is, in essence, the first adversarial test of a theory that Singapore’s regulators have already quietly accepted as true: that the design choices embedded in major platforms — their infinite scrolls, their dopamine-calibrated notification systems, their algorithmically curated feeds — are not neutral features but deliberate mechanisms with foreseeable psychological consequences, especially for young and developing minds.
The Architecture of Addiction
At the heart of the California proceedings is a distinction that carries significant legal and regulatory weight. US law, through Section 230 of the Communications Decency Act, grants platforms near-complete immunity for user-generated content. What it does not protect is product design. The plaintiff’s legal strategy routes around this immunity by focusing not on what users posted on Instagram or YouTube, but on how those platforms were engineered to maximise time-on-platform — and what the companies knew about the effects of those design choices on adolescent neurodevelopment.
TikTok and Snapchat, also named in the original complaint, reached confidential pre-trial settlements. Their departure from the courtroom is telling. Instagram’s head, Adam Mosseri, took the stand on 11 February and rejected the language of addiction in favour of Meta’s preferred term: ‘problematic use’. The distinction is not merely semantic. Clinical addiction, if established before a jury, carries implications of foreseeability and intent that could reshape how thousands of similar lawsuits are adjudicated. ‘Problematic use’ is a softer frame — more individual, less systemic, and far less amenable to corporate liability.
Internal email exchanges shown to jurors, however, revealed a more complicated picture. Mosseri defended a 2020 decision by Zuckerberg to permit cosmetic surgery filters on Instagram — filters that preview how procedures would alter a user’s appearance — despite explicit internal warnings from other executives about their potential to harm young girls’ body image. The stated reason for proceeding: competitive pressure from TikTok.
Singapore Is Watching — and Acting
For observers in Singapore’s Ministry of Digital Development and Information (MDDI) and the Infocomm Media Development Authority (IMDA), the Los Angeles trial arrives at an inflection point in domestic policy. Singapore has been constructing a layered regulatory architecture for online safety since at least 2023, and the trial’s evidence — particularly the internal corporate communications — could accelerate that process considerably.
Singapore’s Code of Practice for Online Safety for Social Media Services, which came into effect in July 2023 under the Broadcasting Act, already requires designated platforms including Facebook, Instagram, YouTube, TikTok, and Twitter to implement enhanced protections for users under 18, deploy proactive content detection systems, and submit annual safety reports to the IMDA. Non-compliance can result in fines of up to S$1 million and, in egregious cases, service blocking. Singapore was among the first jurisdictions globally to impose such obligations.
A second regulatory layer followed in March 2025, when the IMDA’s Code of Practice for Online Safety for App Distribution Services came into force. This code — which targets app stores operated by Apple, Google, Huawei, Microsoft, and Samsung — requires age assurance mechanisms to be implemented at the point of download, using tools such as AI-assisted facial analysis, credit card verification, or Singapore’s national digital identity system, SingPass. App stores that had not yet deployed verified age-checking systems were required to submit implementation plans and timelines for IMDA approval.
Singapore is not merely regulating content. It is beginning to regulate the structural conditions under which children encounter platforms — a posture that more closely resembles the logic of the plaintiff in Los Angeles than it does the defence.
The government is also considering whether these age assurance obligations, currently binding on app stores, should be extended directly to social media platforms themselves. In August 2025, MDDI indicated it was exploring this extension and would initiate discussions with designated platforms. An Online Safety Commission — a new government agency dedicated to assisting victims of online harm — is expected to become operational in the first half of 2026.
The Local Evidence Base
The regulatory trajectory is not occurring in an evidential vacuum. Research conducted in Singapore provides a sobering empirical foundation for the legislative momentum. A 2025 study by Nanyang Technological University, involving researchers from Singapore and Australia, found that prolonged social media use among teenagers is associated with diminished capacity for sustained attention, increased emotional fatigue, and behavioural patterns consistent with addiction criteria. Crucially, the study found that Singaporean teenagers were more likely than their Australian counterparts to credit in-school phone restrictions by the Ministry of Education for reducing compulsive usage — an observation that suggests structural interventions have measurable behavioural effects.
A broader body of research on Singaporean students presents figures that are difficult to dismiss. Among students exhibiting social media addiction patterns, approximately 27 per cent experience anxiety and 21 per cent suffer from depression, with female students disproportionately represented across both categories. Research on university students has found moderate positive correlations between problematic smartphone use and depressive symptoms and perceived stress. A PubMed-indexed study examining college students in Singapore found high prevalence rates of social networking site addiction and noted elevated vulnerability to co-occurring behavioural addictions and affective disorders, again with a pronounced gender differential.
Dr Adrian Loh, a Senior Consultant Psychiatrist at Promises Healthcare, has publicly stressed that addictive behaviours at a young age are a compounding concern. Social media in its current algorithmic form is barely a decade old, and its long-term effects remain poorly mapped: ‘We are still trying to understand downstream implications,’ he has noted. The Los Angeles trial, and the corporate communications it is surfacing, may begin to fill in those gaps in ways that peer-reviewed research alone cannot.
The Precedential Stakes for Asian Regulators
The California proceedings are explicitly designed as bellwether litigation. Legal scholars refer to these as test cases: a small number of carefully selected suits tried to verdict to establish the contours of liability, before the remaining thousands of filed lawsuits are resolved through settlement or further litigation. The tobacco class actions of the 1990s followed a similar structure, with the first successful plaintiff’s verdicts triggering a cascade of multibillion-dollar settlements that restructured an entire industry.
If a California jury returns a verdict for the plaintiff — finding that Meta’s and Google’s platform design choices constitute actionable harm — the decision would not bind courts in Singapore or elsewhere in Asia. But its effects would be felt in several other ways. First, internal corporate documents admitted into evidence become part of a permanent public record, available to regulators and plaintiffs globally. Second, a plaintiff’s verdict would dramatically increase the settlement value of the remaining thousands of US cases, incentivising major payouts and, with them, industry-wide design changes to avoid future liability. Third, the factual findings of the jury — whether platforms knew of harms, whether internal warnings were suppressed for commercial reasons — would provide a factual template that regulators in Singapore and across Southeast Asia could cite in their own proceedings and legislative debates.
Australia has already moved further than Singapore on outright age restrictions, implementing a ban on social media use for children under 16 in late 2025. Malaysia announced plans for a comparable under-16 restriction to take effect in 2026. Indonesia is studying the options. Singapore has thus far taken what officials describe as a ‘collaborative’ approach — working with platforms rather than imposing categorical prohibitions — but the Los Angeles evidence may test the patience of that posture.
A plaintiff’s verdict would not bind Singapore courts. But the internal documents it surfaces — emails showing that executives knew and proceeded anyway — would land on the desks of every regulator in the region.
What Parents and Educators Should Know
Beyond the regulatory and legal dimensions, the trial carries direct significance for the families and educators navigating these questions in real time. For many Singaporean parents, the proceedings in Los Angeles are not an abstraction. The evidence describes, in detail, the specific mechanisms by which platforms retain users: notification timing designed to exploit moments of psychological vulnerability, recommendation algorithms calibrated to maximise emotional arousal, and cosmetic filters that create aspirational gaps between self-image and platform-mediated appearance.
The NTU study found that 65 per cent of surveyed teenagers believe their current digital habits could impair their ability to study or work in later life. Sixty per cent of parents expressed high concern about the impact of social media on their children’s attention, behaviour, and emotional health. One teenager in the study described the dynamic with an economy that no academic paper quite matches: ‘It’s like the apps are made to keep you hooked.’
Those words are, in their way, the plaintiff’s entire case in a sentence. They are also a description of a design intention that the Los Angeles jury will now be asked to weigh, in public, with Zuckerberg himself on the stand. Whatever verdict emerges, the conversation it generates — about corporate knowledge, about the ethics of algorithmic design, about what society owes its youngest users — has arrived in Singapore, and it is not leaving.
Key Developments: Singapore’s Online Safety Architecture
July 2023 · Code of Practice for Online Safety for Social Media Services comes into force; platforms required to implement enhanced restrictions for under-18 users and submit annual safety reports to IMDA.
March 2025 · Code of Practice for Online Safety for App Distribution Services takes effect; major app stores required to deploy age assurance mechanisms.
August 2025 · MDDI announces consideration of extending age assurance obligations to social media platforms directly; consultations with designated services to begin.
H1 2026 · Online Safety Commission expected to become operational, providing support to victims of online harm and overseeing compliance.
Late 2026 · Designated social media services to be engaged on adoption of age assurance frameworks, complementing existing Code of Practice obligations.
If you or someone you know is struggling with issues related to social media use, anxiety, or depression, the Samaritans of Singapore can be reached 24 hours a day on 1-767. The National Addiction Management Service helpline is available at 6732 6837.