
Digital Rights and Online Safety

What digital rights are, why privacy matters online, how the internet has changed human rights, and how to stay safe online without giving up freedom.

Core Ideas
1. Some things are private and belong only to us
2. It is okay to ask a grown-up if you are not sure online
3. Kind words online matter just as much as kind words in person
4. Not everyone online is who they say they are
5. Spending time off screens is good
Background for Teachers

Young children increasingly grow up with screens and apps from very early ages. Even if they are not yet online themselves, they see adults and older children using devices constantly. The core instincts to build at this age are: some things about you are private and should stay private; when you are online, you should still be kind to others; not everyone online is who they seem; a trusted adult is the right person to ask if something worries you. Children do not need technical vocabulary. But building early instincts about privacy, kindness, and trust prepares them for safer online lives later. Be sensitive — children may see things on screens their parents do not know about. The goal is building safety through instinct, not frightening children. No materials are needed.

Classroom Activities
Activity 1 — What is private?
Purpose: Children understand that some things about them belong to them and should not be shared with everyone.
How to run it: Ask the children: are there some things about you that are private? What? Collect answers. Prompts: your body, your home address, where you go to school, your family's phone number, a secret a friend shared. Explain: these are things that belong to you. You can share them if you choose, with people you trust. But they are not for everyone to know. This is called privacy. Now ask: does the same apply online? Yes — even more. When you share something online, many people you do not know might see it. Things on the internet are very hard to take back. So before you share anything, think: is this something I want everyone to know? If not, keep it private.
💡 Low-resource tip: Discussion only. No materials needed.
Activity 2 — Kind words, kind world
Purpose: Children understand that online unkindness is still real unkindness.
How to run it: Tell a simple story: a child writes mean words about a classmate in a class group chat. The classmate sees them. They feel upset. Ask: was this okay? The words were written on a screen, not spoken aloud. Does that make a difference? Discuss: words on a screen are still real words. The other person reads them with real eyes and feels real feelings. Mean words online are still mean. In fact, screen words can be worse because: (1) many people can see them; (2) they can last forever; (3) the writer often forgets that a real person is reading. Ask: what are three kind things you could say to someone online?
💡 Low-resource tip: Tell the story verbally. No materials needed.
Activity 3 — Ask a trusted grown-up
Purpose: Children learn that trusted adults are the right people to ask about online problems.
How to run it: Describe some scenarios. For each, ask: what should the child do? (1) A stranger online asks to be friends and wants to know where you live. (2) Someone you do not know sends you pictures that feel uncomfortable. (3) A game you play keeps asking for money. (4) Someone says mean things about you online and you feel sad. (5) You see something on a screen that scares or confuses you. Discuss: in each case, the best answer is to tell a grown-up you trust — a parent, a teacher, a grandparent. You will not be in trouble. Real grown-ups who care about you want to help, even when the problem started on a screen. Keeping it secret makes it worse.
💡 Low-resource tip: Discussion only. No materials needed. Handle with care.
Discussion Questions
  • Q1: What is something about you that is private?
  • Q2: If someone says something mean online, how do you think it feels to read it?
  • Q3: Who is a grown-up you could tell if something online worried you?
  • Q4: Is everyone online who they say they are?
  • Q5: What do you like doing that does not use a screen?
Writing Tasks
Drawing task
Draw a picture of someone being kind online, and a picture of something you enjoy doing away from screens. Write or say: Online, I can be kind by ___________. Without a screen, I love ___________.
Skills: Modelling kind online behaviour and valuing offline life
Sentence completion
Some things about me should stay private, like ___________. If something online worries me, I can ___________.
Skills: Articulating privacy and safety
Common Misconceptions
Common misconception

Words on a screen do not really hurt people.

What to teach instead

Words online are just as real as spoken words. The person reading them has real feelings. In some ways, mean words online can hurt even more — because many people might see them, and they stay around for a long time. Being kind online is not a smaller thing than being kind in person. It is the same thing.

Common misconception

If someone is nice online, they must be nice in real life.

What to teach instead

Online, people can pretend to be anyone they want. A person who says they are your age might be much older. A person who seems kind might be trying to trick you. This does not mean everyone online is bad — most people are not. But until you know for sure, be careful. Never meet someone you only know from online in person, never give them your address, and never share pictures with them without a trusted grown-up knowing.

Core Ideas
1. Human rights apply online
2. Privacy and why it matters
3. Children's safety online
4. The digital divide — access and inequality
5. Cyberbullying
6. Who sees your data
Background for Teachers

Digital rights are the human rights that apply in the digital environment — online and in the wider world shaped by digital technology. The UN Human Rights Council has affirmed (since 2012) that 'the same rights that people have offline must also be protected online'. This covers freedom of expression, freedom of assembly, the right to privacy, the right to education, the right to seek and receive information, and the right to take part in cultural and political life. Digital rights are not entirely new rights — they are existing human rights applied to digital contexts.

Some key digital rights issues include

Privacy and data protection (governments and companies collect huge amounts of personal information); freedom of expression online (including the challenge of content moderation); access to the internet itself (the 'digital divide' between those who have it and those who do not); protection from surveillance (especially mass surveillance by states); online safety, particularly for children and vulnerable groups; and the right to be forgotten (to have personal data deleted). Privacy online is a central and contested right. Every device people use collects data. Apps track location, contacts, communications, purchases, and interests. This data is used for advertising, sold to third parties, and in some cases shared with governments. The scale of data collection is unprecedented in history. Major laws like the EU's General Data Protection Regulation (GDPR, 2018) have tried to give individuals more control, but in most of the world, people have limited protection.

Children online face specific risks

Contact with strangers, inappropriate content, cyberbullying, exploitation, and the collection of their personal data from very early ages. The UN Committee on the Rights of the Child issued General Comment 25 (2021) specifically on children's rights in the digital environment.

Key principles include

Children's rights apply online; design of services should account for children; parents and caregivers should have support; education on digital literacy is essential.

The digital divide — unequal access to the internet — is a major issue. About two-thirds of the world's population now has internet access, but gaps remain. Access is lower in rural areas, poorer countries, among older people, and among women in many contexts. Those without access are increasingly excluded from education, work, banking, and political life.

Cyberbullying — bullying that takes place online — can be relentless because it follows victims everywhere, reaches large audiences, and is difficult to escape. Research shows serious consequences for mental health, especially among young people.

Teaching note

This topic is changing rapidly, and students may know more about specific platforms than teachers do. Focus on durable principles — privacy, kindness, safety, trust — rather than specific apps that will change. Approach with care given that many children encounter difficult content online.

Key Vocabulary
Digital rights
Human rights that apply online — including privacy, free speech, access to information, and safety in the digital environment.
Privacy
The right to control what others know about you. Online, this includes your personal data, messages, photos, and internet activity.
Data
Information about you — your name, age, location, contacts, activities, preferences. Online companies and governments collect huge amounts of data from users.
Surveillance
Watching or monitoring what people do — including online. Some surveillance (like a court-approved investigation) is lawful. Mass surveillance of everyone is a serious human rights concern.
Cyberbullying
Bullying that happens online — through messages, social media posts, shared images, or group chats.
Digital divide
The gap between people who can use the internet and those who cannot — because of cost, location, age, disability, or other reasons.
Password
A secret word or phrase used to prove that an online account belongs to you. Good passwords are long, unique, and never shared.
Scam
A trick used to steal money or information — often by pretending to be a trusted organisation or person online.
Classroom Activities
Activity 1 — Who sees what?
Purpose: Students understand the scale of data collection online.
How to run it: Ask students: when you use a phone or computer, who might see what you do? Collect answers. Prompt them to think beyond friends and family. The company that makes the app. The company that makes the phone. The internet provider. Advertisers who pay to see patterns of what people look at. In some cases, governments. Ask: is this okay? Did you know all these people and companies could see what you do? Discuss: every app you use collects information. Some collect your messages, some your location, some your contacts, some your purchases. This information is used to sell you things, shape what you see, and sometimes for other purposes. This does not mean all use is bad — some is useful. But it does mean you are usually giving up more than you realise when you use a 'free' app. There is a saying: if an app is free, you are not the customer — you are the product.
💡 Low-resource tip: Teacher writes on the board. Discussion only. No materials needed.
Activity 2 — Cyberbullying and what to do
Purpose: Students understand cyberbullying and practical responses.
How to run it: Describe what cyberbullying is. Bullying that takes place online — through messages, posts, shared pictures, group chats, or gaming platforms. It can be: mean messages sent directly; spreading false or embarrassing information; leaving someone out of group chats; sharing private photos without permission; creating fake accounts to mock someone. Ask: why is cyberbullying especially harmful? Because: it can happen at any time of day or night, anywhere the victim has a phone; it can reach many people fast; the bullying often stays online, so you can read it again and again; the victim often cannot escape — even at home. Discuss responses. If you see someone being bullied online: do not join in; say something kind to the person; tell a trusted adult. If you are being bullied online: save evidence (screenshots); tell a trusted adult; block the person; remember it is not your fault; and report to the platform. Ask: is it ever okay to respond with your own mean words? Usually not — it almost always makes things worse. Strong responses are typically calm, firm, and involve adults.
💡 Low-resource tip: Teacher presents information verbally. Discussion only. No materials needed.
Activity 3 — The digital divide
Purpose: Students understand that not everyone has equal access to the digital world.
How to run it: Ask students: who in your class or community has: a smartphone? A computer at home? Good internet? A quiet place to work online? Note that answers may differ greatly. Expand globally. About two-thirds of the world's population uses the internet. But one-third does not — mostly in poorer countries and rural areas. In some African countries, under 20% have internet access. In other countries, women have much less access than men. Older people often have less access. People with some disabilities may struggle with inaccessible designs. Ask: what does this mean for someone without internet today? Increasingly, things require digital access: schoolwork, job applications, banking, government services, health information. Without access, people are cut off from things that others take for granted. Discuss: the digital divide is not just about convenience — it is about opportunity. Closing this divide is a human rights issue, not only a technology issue.
💡 Low-resource tip: Discussion only. Use local and global examples as available. No materials needed.
Discussion Questions
  • Q1: How much of your life is spent online? How does it compare to your parents or grandparents at your age?
  • Q2: Who sees the information you put online? Does that change what you would share?
  • Q3: What makes a password a good one?
  • Q4: Why does cyberbullying often feel worse than bullying in person?
  • Q5: Is it fair that some people have good internet and some do not? Whose responsibility is this?
  • Q6: What is something you would want to know before sharing a photo or message online?
Writing Tasks
Task 1 — Explain and give an example
Explain why privacy matters online and give ONE example of information you would want to keep private. Write 3 to 5 sentences.
Skills: Explanation writing, understanding privacy, using examples
Task 2 — Persuasive writing
Write a paragraph (4 to 6 sentences) arguing that cyberbullying is just as serious as bullying in person, and should be addressed just as seriously.
Skills: Persuasive writing, giving reasons, understanding the consequences of online harm
Common Misconceptions
Common misconception

If an online service is free, it costs you nothing.

What to teach instead

Most 'free' online services — social media, email, games, search engines — are paid for by collecting your data and showing you ads. You are not paying with money, but you are paying with information about yourself. This is not necessarily wrong, but it is a trade: your data for the service. The saying 'if the product is free, you are the product' captures this. Understanding this trade helps you make better choices about what to use and what to share.

Common misconception

I have nothing to hide, so privacy does not matter to me.

What to teach instead

Privacy is not about hiding wrongdoing; it is about controlling who knows what about you. Even people with nothing to hide benefit from privacy — keeping medical information, finances, relationships, and private conversations private. Loss of privacy can lead to harms you did not expect: identity theft, targeted manipulation, unfair treatment, or unwanted contact. Everyone needs some privacy to develop their own life without constant judgement.

Common misconception

Deleting something online means it is gone forever.

What to teach instead

Once something is shared online, it may be copied, screenshotted, or cached by many systems. Even if you delete it from one place, copies may remain elsewhere. Anything sent to another person is out of your control. This is why thinking before sharing matters so much — you cannot always take things back. This is not a reason to never share anything, but a reason to be thoughtful.

Core Ideas
1. The principle that human rights apply online
2. Privacy as a fundamental right
3. Mass surveillance and the Snowden revelations
4. Platform power and content moderation
5. Children's digital rights
6. Digital authoritarianism
7. The digital divide and internet shutdowns
8. AI and new rights challenges
Background for Teachers

Digital rights are one of the fastest-evolving areas of human rights. Understanding the main frameworks and current debates is essential for teaching at secondary level. The principle of online/offline equivalence: the UN Human Rights Council resolved in 2012 (and repeatedly since) that 'the same rights that people have offline must also be protected online'. This is the foundational principle of digital rights — existing human rights do not disappear online but must be applied to new contexts. The challenge is that applying them requires significant reinterpretation: what does 'freedom of assembly' mean in online spaces? What does 'privacy' mean when every device collects data? What does 'freedom of expression' mean when private platforms control public discourse?

Privacy as a fundamental right

Privacy is protected by Article 12 of the UDHR and Article 17 of the ICCPR. Modern digital privacy concerns fall into two categories. Government surveillance — where states monitor citizens' communications, movements, and activities — has expanded dramatically since the 'war on terror' began in 2001. Commercial surveillance — where companies collect personal data for profit — has become the business model of the internet. Both raise serious rights concerns. The Snowden revelations: Edward Snowden's 2013 disclosures revealed that the US National Security Agency (NSA) and its allies had built global mass surveillance systems that collected vast amounts of data on communications worldwide. The revelations showed that democratic governments had developed surveillance capabilities comparable to those imagined only in science fiction. Snowden's disclosures prompted significant legal reforms in some countries and are still shaping policy debates a decade later. The GDPR: the EU's General Data Protection Regulation (2018) is the most comprehensive data protection law in the world. It gives individuals rights over their data (to access, correct, delete), requires explicit consent for data processing, imposes large fines for violations, and has had significant global effects (many non-EU companies have adjusted to meet GDPR standards). Related laws have followed in California, Brazil, India, and elsewhere.

Platform power and content moderation

Major digital platforms (Meta/Facebook, Google, Twitter/X, TikTok, YouTube) now mediate significant portions of public discourse. Their content moderation decisions — what to leave up, what to remove, whom to deplatform — affect billions of people. The scale of their moderation is staggering: YouTube alone removes millions of videos per quarter. Content moderation raises genuinely difficult questions. Platforms remove content they judge harmful (hate speech, terrorism, child abuse, dangerous misinformation), but they also make many errors, are accused of both over-removal and under-removal, and face pressure from governments of all kinds. The EU's Digital Services Act (2022) requires major platforms to meet specific content standards and has created new regulatory mechanisms.

Children's digital rights

General Comment 25 (2021) of the UN Committee on the Rights of the Child applied the UNCRC to the digital environment.

Key principles include

Children's rights apply online; design of services should consider children; data protection for children should be stronger; education on digital literacy is essential; and children should participate in decisions that affect their digital lives. Specific concerns include addiction (particularly with engagement-maximising design), exposure to harmful content, grooming, sexual exploitation, mental health impacts, and data exploitation.

Digital authoritarianism

Authoritarian regimes use digital tools to consolidate power. Mass surveillance networks (China's, Russia's, Iran's among others), facial recognition, social credit systems, internet controls (the Great Firewall of China), and internet shutdowns are increasingly widespread. India has the world's highest number of internet shutdowns (often linked to political events or protests). These tools threaten democratic freedoms wherever they are deployed.

The digital divide: global internet access has risen but gaps remain. About 2.6 billion people still lack internet access, concentrated in developing countries, rural areas, and among women (the UN reports a persistent gender digital divide). Within countries, the divide separates those with fast, reliable, affordable access from those without — shaping access to education, work, government services, and political participation. The COVID-19 pandemic made clear that digital access is now essential to full social participation.

Internet shutdowns

Governments increasingly respond to political events, protests, and elections by shutting down internet access entirely or restricting specific services. The #KeepItOn campaign and Access Now track these shutdowns — several hundred per year globally, with particular concentrations in India, Myanmar, Iran, Ethiopia, and several African countries. Shutdowns are associated with human rights violations, economic losses, and interference with education.

AI and new rights challenges

Artificial intelligence systems are reshaping what is at stake in digital rights. Facial recognition raises new surveillance concerns. Automated decision-making in welfare, hiring, lending, and criminal justice can produce discrimination at scale. Deepfakes threaten political discourse. Generative AI raises questions about authorship, misinformation, and the information environment. The EU AI Act (2024) represents the first major regulatory framework; debates continue globally.

Teaching note

This is a rapidly changing field where students may have more current knowledge of specific platforms than teachers. Focus on durable principles — privacy, free expression, safety, equality of access, accountability of power — that outlast specific apps.

Key Vocabulary
Digital rights
Human rights as they apply in the digital environment — encompassing privacy, freedom of expression, access to information, assembly, safety, and participation in digital life.
Data protection
Legal protections for personal data — including rights to access, correction, deletion, and consent to processing. Strongest in the EU's General Data Protection Regulation (GDPR).
Mass surveillance
Large-scale monitoring of populations — as opposed to targeted surveillance of specific suspects. Revealed in scale by the Snowden disclosures (2013); a major human rights concern.
Content moderation
The process by which digital platforms decide what user content to allow, restrict, or remove. A central site of modern free speech debates.
Digital divide
Inequality in access to digital technologies — along lines of income, geography, age, gender, and disability. Creates a gap between those who can and cannot participate fully in digital life.
Internet shutdown
Deliberate restriction or blocking of internet access by a government — often during political events, protests, or elections. Has become increasingly common worldwide.
Right to be forgotten
The right — established in EU law — to have certain personal data deleted from online services when it is no longer relevant or the person withdraws consent.
End-to-end encryption
A security method that ensures messages can only be read by sender and recipient — not by the service provider or third parties. Central to modern digital privacy; contested by some governments.
Deepfake
Synthetic media — typically video or audio — generated by AI to convincingly imitate real people. Raises new challenges for truth, privacy, and digital rights.
Algorithmic decision-making
The use of automated systems to make or shape decisions about individuals — in hiring, lending, welfare, policing, and many other areas. Raises concerns about transparency, bias, and accountability.
Classroom Activities
Activity 1 — Applying offline rights to online contexts
Purpose: Students engage with the challenge of translating established human rights into digital contexts.
How to run it: Explain the UN's principle: the same rights that people have offline must also be protected online. Test this principle with specific cases. (1) Freedom of assembly: people have the right to gather peacefully. What does this mean for online group chats, virtual protests, or online communities? Can governments block video conferencing services during protests? Does this violate freedom of assembly? (2) Freedom of expression: people have the right to speak. What does this mean when a private platform removes their content? Is it censorship or private choice? Should governments require platforms to keep certain speech up? (3) Privacy: people have the right to privacy of communications. What does this mean when states monitor everyone's messages? Is bulk collection compatible with privacy if no one reads most of it? What about mass facial recognition in public spaces? (4) Press freedom: do journalists online have the same protections as print journalists? Do citizen journalists? Do bloggers? Discuss: what is common across these cases? The principle is clear — rights apply online — but applying it requires difficult choices. The challenges come from scale (platforms affect billions), from private power (platforms are not governments), from the technical nature of the systems, and from the pace of change. Engaging with specific cases reveals how demanding the principle really is.
💡 Low-resource tip: Teacher presents cases verbally. Students discuss in groups. No materials needed.
Activity 2 — The Snowden revelations
Purpose: Students engage with the defining modern case of digital rights.
How to run it: Tell the story. In June 2013, Edward Snowden, a contractor for the US National Security Agency (NSA), leaked classified documents to journalists Glenn Greenwald and Laura Poitras. The documents revealed that the US and its 'Five Eyes' allies (UK, Canada, Australia, New Zealand) had built mass surveillance systems that collected data from hundreds of millions of people globally — through direct access to major tech companies (PRISM), bulk collection of phone metadata, tapping of undersea cables, and cooperation from other intelligence services. The scope was far beyond what had been publicly acknowledged. Discuss the responses. Some called Snowden a hero exposing grave violations of human rights. Others called him a traitor who endangered national security. He fled to Hong Kong, then Russia, where he remains. Discuss the consequences. The USA FREEDOM Act (2015) modestly reformed US surveillance. The EU court struck down the 'safe harbor' data transfer agreement. Tech companies increased their use of encryption. International debate on surveillance intensified. But the core surveillance architecture remains largely in place, and in many countries, surveillance has expanded since. Discuss ethical questions. Should Snowden have acted? Some argue he should have used official whistleblower channels; supporters argue these channels were blocked. Was public interest served by the disclosures? Did the revelations damage genuine security interests? What is the appropriate balance between security and privacy in a democracy? Discuss: the Snowden case forced democracies to confront surveillance practices that had grown up in secret. The questions it raised are not settled.
💡 Low-resource tip: Teacher tells the story verbally. Students discuss in groups. No materials needed.
Activity 3 — Platform power and content moderation
Purpose: Students engage with one of the most contested current digital rights issues.
How to run it: Present the situation. A handful of major platforms — Meta (Facebook, Instagram, WhatsApp), Google (YouTube, Search), ByteDance (TikTok), X (formerly Twitter) — now mediate much of public discourse for billions of people. Their decisions about what content to allow, remove, amplify, or suppress shape what people see, believe, and can say. Present the tensions. Platforms are under pressure from many directions. Some governments demand they remove more content (misinformation, hate speech, terrorism content). Others demand they keep content up (political speech, religious expression). Users complain both about too much and too little removal. Advertisers want 'brand-safe' environments. Employees sometimes push for specific changes. Civil society groups advocate for their concerns. Present specific cases. (1) Facebook's role in spreading anti-Rohingya content in Myanmar before the 2017 ethnic cleansing — the company was later criticised for not acting sooner. (2) Twitter's decision to ban Donald Trump after 6 January 2021 — celebrated by some, criticised as platform overreach by others. (3) TikTok's alleged suppression of political content sensitive to China — raising questions about foreign platform influence. (4) YouTube's algorithms promoting extremist content at various points. Present regulatory responses. The EU's Digital Services Act (2022) imposes obligations on large platforms — transparency, due process, risk mitigation — without directly controlling content. National laws vary widely. Ask: what should the framework be? Private companies cannot be forced to host everything. Governments cannot be trusted to decide what speech is allowed. Users have legitimate interests in both safety and free expression. What arrangement balances these?
💡 Low-resource tip: Teacher presents cases verbally. Students discuss in groups. No materials needed.
Discussion Questions
  • Q1: The UN principle that human rights apply online is elegant but difficult to apply. What specific cases most challenge the principle?
  • Q2: Mass surveillance by states is justified by governments on security grounds. Is there a level of surveillance that is incompatible with a free society?
  • Q3: Private platforms moderate speech for billions of people. Is this fundamentally different from state censorship, or essentially similar in effect?
  • Q4: Children face particular risks online. What is the right balance between protection (content restrictions, age verification, parental controls) and autonomy (children's rights to information and participation)?
  • Q5: Internet shutdowns by governments during protests violate multiple human rights. Why have they spread so rapidly despite this?
  • Q6: AI systems are making decisions about people — who gets loans, jobs, welfare benefits. How can accountability and fairness be maintained in these systems?
  • Q7: End-to-end encryption protects privacy but also makes it harder for law enforcement to investigate serious crimes. Where is the right balance?
Writing Tasks
Task 1 — Extended essay
'Privacy is a 20th-century concept that cannot survive in the digital age.' To what extent do you agree? Write 400 to 600 words.
Skills: Thesis-driven argument, engaging with privacy theory and digital reality, balanced analysis
Task 2 — Analytical response
Explain what the 'digital divide' is, why it has become more serious in the past decade, and what closing it would require. Write 200 to 300 words.
Skills: Explaining a concept, analysing a trend, proposing responses
Common Misconceptions
Common misconception

Privacy is only important if you have something to hide.

What to teach instead

Privacy protects everyone, not only wrongdoers. It enables the development of personal identity, intimate relationships, political opinions, religious beliefs, and creative expression without constant external observation. Without privacy, people self-censor, minority views are suppressed, and power concentrates in those who observe. Historical examples — from secret police in totalitarian states to the targeting of civil rights activists in democracies — show the real harms that follow loss of privacy. 'Nothing to hide' misunderstands what privacy is and why it matters.

Common misconception

Government surveillance is necessary for security and is therefore acceptable.

What to teach instead

The trade-off between security and privacy is genuine but the claim that any security concern justifies any surveillance is not. Mass surveillance has been shown to be of limited effectiveness for preventing specific attacks (most attacks are carried out by individuals already known to authorities). Targeted surveillance with judicial oversight can be effective; bulk surveillance of entire populations is disproportionate. International human rights law — as interpreted by the UN Human Rights Committee and major courts — requires that surveillance be necessary, proportionate, authorised by law, and subject to meaningful oversight. These standards are often violated.

Common misconception

Platform content moderation is censorship.

What to teach instead

Platform moderation is not censorship in the traditional legal sense — private platforms are not governments and have different legal positions. But the scale and influence of major platforms raise genuine questions about power and expression. The framing as either 'private choice' or 'censorship' is too simple. Platform decisions affect billions and deserve serious scrutiny — but this is different from state censorship, which raises different issues. The appropriate response involves transparency, due process, and regulation that preserves pluralism without replacing private judgement with government judgement.

Common misconception

Children are 'digital natives' and therefore naturally competent and safe online.

What to teach instead

Familiarity with digital technology is not the same as digital literacy or online safety. Research shows that young people are often no better than older people at identifying disinformation, recognising scams, or protecting their privacy. They may be more vulnerable to specific platform-based risks — TikTok algorithms, social media pressure, online harassment. The 'digital native' framing can obscure genuine risks and responsibilities. Young people need explicit education on digital literacy, privacy, and safety — their familiarity with devices does not provide this automatically.

Further Information

Key texts accessible to students: Shoshana Zuboff, 'The Age of Surveillance Capitalism' (2019) — foundational critique of commercial data collection; long but the key ideas are clear. Glenn Greenwald, 'No Place to Hide' (2014) — the insider account of the Snowden disclosures. Carole Cadwalladr's Cambridge Analytica investigation (Guardian, 2018) — on political data exploitation. For theory: Helen Nissenbaum, 'Privacy in Context' (2010); Daniel Solove, 'Understanding Privacy' (2008). For children and digital rights: UN Committee on the Rights of the Child General Comment 25 (2021). For current research: the Oxford Internet Institute (oii.ox.ac.uk), the Electronic Frontier Foundation (eff.org), Access Now (accessnow.org), Article 19 (article19.org), and Privacy International (privacyinternational.org) all publish accessible work. The Berkman Klein Center at Harvard (cyber.harvard.edu) is a major academic resource.