Social Media Prohibition for Minors in Karnataka and Andhra Pradesh: Constitutional Competence, Federalism, and Digital Rights

On March 7, 2026, Karnataka Chief Minister Siddaramaiah, during the presentation of the State Budget for 2026-27, announced that Karnataka proposes to ban social media use for children under 16 years of age. On the same day, Andhra Pradesh Chief Minister N. Chandrababu Naidu announced a similar prohibition for children below 13, with a 90-day implementation window. The simultaneous announcements by two State governments have placed the question of social media regulation for children at the centre of a complex constitutional debate about federal competence—specifically, whether State legislatures have the authority to regulate digital intermediaries operating under the Information Technology Act, 2000, which is a Central law.

This issue has emerged in a global context where Australia became the first country to ban social media for children below 16 in December 2025, with penalties of up to 49.5 million Australian dollars for systemic violations. Indonesia announced a similar ban for children under 16 on March 7, 2026. The question of whether and how governments should regulate children’s access to social media platforms has become one of the defining governance challenges of the digital age, involving children’s rights, mental health, platform accountability, and the boundaries of state power in a networked world.

For UPSC aspirants, this issue spans multiple analytical dimensions: federalism and the division of legislative powers under the Seventh Schedule; children’s rights under Articles 21 and 21A (alongside the directive principle in Article 45); the constitutionality of digital regulation; and the broader policy debate about the state’s role in protecting vulnerable populations from algorithmic harm. The challenge of implementation raises its own questions about governance capacity and institutional design: how can State governments enforce age verification on platforms like Instagram, YouTube, and TikTok that operate entirely online?

Background and Context

Five Key Points
1. The Information Technology Act of 2000 and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules of 2021 constitute the primary legal framework governing digital intermediaries in India, and both are Central laws under Entry 31 of the Union List dealing with ‘posts and telegraphs, telephones, wireless, broadcasting and other like forms of communication.’
2. Australia’s Online Safety Amendment (Social Media Minimum Age) Act 2024, which took effect in December 2025, is the world’s first legislative social media age ban and imposes obligations on platforms rather than on individual users or parents, placing the compliance burden on technology companies.
3. Research published by the U.S. Surgeon General’s Advisory in 2023 found that adolescents who spend more than three hours per day on social media face double the risk of depression and anxiety symptoms, providing the primary public health justification for regulatory intervention.
4. Meta, which operates Facebook, Instagram, and WhatsApp, has stated it will comply with bans ‘where they are enforced’ but argues that similar protections should apply to all apps children access, not just social media, raising questions about regulatory selectivity.
5. India’s Union Minister for Electronics and Information Technology Ashwini Vaishnaw indicated in early 2026 that the Central government was discussing age-based restrictions on social media use but had not committed to a specific implementation timeline.

Constitutional Framework: Division of Powers and the Federalism Question

The constitutional challenge to any State-level social media ban is rooted in Articles 245, 246, and the Seventh Schedule of the Constitution of India. Entry 31 of the Union List assigns legislative authority over ‘posts and telegraphs, telephones, wireless, broadcasting and other like forms of communication’ to Parliament. The Supreme Court in Shreya Singhal v. Union of India (2015) struck down Section 66A of the IT Act as unconstitutional; although the case turned on free speech rather than federalism, it proceeded on the footing that internet regulation is governed by the Central IT Act framework.

Article 254 provides that where a State law on a Concurrent List subject is repugnant to a Central law, the Central law prevails; for subjects on the Union List, States lack legislative competence altogether under Article 246. The IT Act of 2000 and the IT Rules of 2021 already create a comprehensive regulatory framework for digital intermediaries, including provisions relating to content moderation, grievance redressal, and age-appropriate content. Any State legislation that imposes additional obligations or restrictions on digital platforms, which are governed by Central law, would therefore face a serious constitutional challenge on the grounds of legislative incompetence.

However, States are not entirely without ammunition. Under Entries 1 (public order) and 6 (public health and sanitation) of the State List, and Entry 25 (education) of the Concurrent List, States can argue that social media restrictions serve a legitimate state interest in protecting public health and child welfare. The question of constitutional fit, as noted by digital rights scholars, becomes particularly contested when State measures operate directly on digital intermediaries rather than through the education or public health apparatus.

Fundamental Rights Dimension: Children’s Rights and the Right to Information

The proposed ban raises important questions under Article 21 (right to life and personal liberty, which the Supreme Court has interpreted to include the right to information and right to education in the digital age), Article 19(1)(a) (freedom of speech and expression), and Article 19(1)(g) (freedom to practise any profession, or to carry on any occupation, trade or business). In Justice K.S. Puttaswamy v. Union of India (2017), a nine-judge bench of the Supreme Court held that privacy is a fundamental right, a ruling that cuts both ways in the social media debate: it can be invoked to protect children’s privacy from algorithmic profiling, or to resist government overreach into personal digital choices.

Children’s rights in India are also governed by the Protection of Children from Sexual Offences (POCSO) Act, 2012, the Juvenile Justice Act, 2015, and international obligations under the United Nations Convention on the Rights of the Child (UNCRC), which India ratified in 1992. The UNCRC requires that state measures affecting children be guided by the ‘best interests of the child’ and not disproportionately restrict children’s rights to information and participation in public life.

Implementation Challenges and Governance Capacity

The most fundamental challenge to a social media ban for minors is enforcement. Social media platforms currently rely on self-declaration of age, and the technical infrastructure for robust age verification does not exist in India at scale. Building reliable age verification systems raises its own concerns about data privacy—requiring children or their guardians to submit identity documents to private platforms creates new risks of data breach and misuse.

The Internet Freedom Foundation has argued that blanket social media bans are a ‘disproportionate response that can do more harm than good’ because they restrict children’s right to information and expression while failing to address root causes—including platform design choices that maximise engagement over safety and inadequate digital literacy infrastructure. Medical experts, including Dr. Rakshay Shetty of Rainbow Children’s Hospital, have similarly cautioned that blanket bans may be counterproductive and could remain ‘paper tigers’ with no effective enforcement.

Comparative Analysis: Global Approaches

The global regulatory landscape reveals a spectrum of approaches. Australia’s law is the most restrictive, placing compliance obligations on platforms rather than users. The United Kingdom’s Age Appropriate Design Code (Children’s Code) takes a more nuanced approach, requiring platforms to design their services with children’s best interests in mind without prohibiting access. France’s 2023 law requires parental consent for children under 15 to access social media, while the European Union’s Digital Services Act requires risk assessments and mitigation measures for platforms likely to be accessed by children.

The evidence base for blanket bans is mixed. Longitudinal research from the United States and the United Kingdom suggests that screen time alone is not determinative of mental health outcomes; what matters more is the nature of interaction, the displacement of sleep and physical activity, and exposure to harmful content. This nuance is lost in blanket prohibition approaches.

Way Forward

The Central government should develop a comprehensive National Digital Safety for Children framework that places legally binding obligations on platforms, mandates default privacy settings for users below 18, and requires algorithmic transparency for content served to minors. Rather than age-gating entire platforms, India should consider a tiered approach: mandatory parental consent for children below 13, robust safety-by-design requirements for the 13-16 age group, and digital literacy programmes integrated into the school curriculum at the national level. A dedicated regulatory body—perhaps an expanded mandate for the Data Protection Board under the Digital Personal Data Protection Act, 2023—could oversee compliance.
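The tiered approach sketched above can be made concrete as a small rule table. A minimal Python sketch, assuming illustrative obligation labels and the age thresholds proposed in this section (these labels are not terms from any enacted Indian law):

```python
# Illustrative sketch only: the tiered approach described above,
# encoded as a simple rule table. Age thresholds follow this
# section's proposal; obligation labels are assumed for illustration.

def obligations_for(age: int) -> list[str]:
    """Platform obligations that would apply to a user of a given age."""
    if age < 13:
        # Youngest tier: access only with verified parental consent.
        return ["parental_consent", "default_privacy", "algorithmic_transparency"]
    if age < 16:
        # Middle tier: safety-by-design duties instead of consent gating.
        return ["safety_by_design", "default_privacy", "algorithmic_transparency"]
    if age < 18:
        # All minors: privacy defaults and transparency for served content.
        return ["default_privacy", "algorithmic_transparency"]
    return []  # Adults: no child-specific obligations.

print(obligations_for(12))  # youngest tier includes parental_consent
print(obligations_for(14))  # middle tier includes safety_by_design
```

The point of the table is the design choice it encodes: obligations attach to platforms by age band, rather than gating entire platforms at a single cutoff.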

Relevance for UPSC and SSC Examinations

GS Paper II: Government policies and interventions for development in various sectors; issues relating to development and management of education; federalism and centre-state relations. GS Paper III: Role of media and social networking sites; awareness in IT and cyber security. GS Paper IV: Ethics in public policy; rights and responsibilities in the digital age.

SSC Examinations: Digital India, child rights, internet governance, IT Act. Key terms: Article 254, Shreya Singhal case, IT Act 2000, IT Rules 2021, POCSO, UNCRC, Age Appropriate Design Code, Digital Personal Data Protection Act 2023, algorithmic accountability, Data Protection Board.