What digital rights are, why privacy matters online, how the internet has changed human rights, and how to stay safe online without giving up freedom.
Young children increasingly grow up with screens and apps from very early ages. Even if they are not yet online themselves, they see adults and older children using devices constantly. The core instincts to build at this age are: some things about you are private and should stay private; when you are online, you should still be kind to others; not everyone online is who they seem; a trusted adult is the right person to ask if something worries you. Children do not need technical vocabulary. But building early instincts about privacy, kindness, and trust prepares them for safer online lives later. Be sensitive — children may see things on screens their parents do not know about. The goal is building safety through instinct, not frightening children. No materials are needed.
Words on a screen do not really hurt people.
Words online are just as real as spoken words. The person reading them has real feelings. In some ways, mean words online can hurt even more — because many people might see them, and they stay around for a long time. Being kind online is not a smaller thing than being kind in person. It is the same thing.
If someone is nice online, they must be nice in real life.
Online, people can pretend to be anyone they want. A person who says they are your age might be much older. A person who seems kind might be trying to trick you. This does not mean everyone online is bad — most people are not. But until you know for sure, be careful. Never meet in person, share your address, or send pictures to someone you know only online without a trusted grown-up knowing first.
Digital rights are the human rights that apply in the digital environment — online and in the wider world shaped by digital technology. The UN Human Rights Council has affirmed (since 2012) that 'the same rights that people have offline must also be protected online'. This covers freedom of expression, freedom of assembly, the right to privacy, the right to education, the right to seek and receive information, and the right to take part in cultural and political life. Digital rights are not entirely new rights — they are existing human rights applied to digital contexts.
The main digital rights issues include: privacy and data protection (governments and companies collect huge amounts of personal information); freedom of expression online (including the challenge of content moderation); access to the internet itself (the 'digital divide' between those who have it and those who do not); protection from surveillance (especially mass surveillance by states); online safety, particularly for children and vulnerable groups; and the right to be forgotten (to have personal data deleted). Privacy online is a central and contested right. Every device people use collects data. Apps track location, contacts, communications, purchases, and interests. This data is used for advertising, sold to third parties, and in some cases shared with governments. The scale of data collection is unprecedented in history. Major laws like the EU's General Data Protection Regulation (GDPR, 2018) have tried to give individuals more control, but in most of the world, people have limited protection.
Children face particular risks online: contact with strangers, inappropriate content, cyberbullying, exploitation, and the collection of their personal data from very early ages. The UN Committee on the Rights of the Child issued General Comment 25 (2021) specifically on children's rights in the digital environment.
General Comment 25 established that children's rights apply online; that services should be designed with children in mind; that parents and caregivers should have support; and that education in digital literacy is essential. The digital divide — unequal access to the internet — is a major issue. About two-thirds of the world's population now has internet access, but gaps remain. Access is lower in rural areas, poorer countries, among older people, and among women in many contexts. Those without access are increasingly excluded from education, work, banking, and political life. Cyberbullying — bullying that takes place online — can be relentless because it follows victims everywhere, reaches large audiences, and is difficult to escape. Research shows serious consequences for mental health, especially among young people.
This topic is changing rapidly, and students may know more about specific platforms than teachers do. Focus on durable principles — privacy, kindness, safety, trust — rather than specific apps that will change. Approach with care given that many children encounter difficult content online.
If an online service is free, it costs you nothing.
Most 'free' online services — social media, email, games, search engines — are paid for by collecting your data and showing you ads. You are not paying with money, but you are paying with information about yourself. This is not necessarily wrong, but it is a trade: your data for the service. The saying 'if the product is free, you are the product' captures this. Understanding this trade helps you make better choices about what to use and what to share.
I have nothing to hide, so privacy does not matter to me.
Privacy is not about hiding wrongdoing; it is about controlling who knows what about you. Even people with nothing to hide benefit from privacy — keeping medical information, finances, relationships, and private conversations private. Loss of privacy can lead to harms you did not expect: identity theft, targeted manipulation, unfair treatment, or unwanted contact. Everyone needs some privacy to develop their own life without constant judgement.
Deleting something online means it is gone forever.
Once something is shared online, it may be copied, screenshotted, or cached by many systems. Even if you delete it from one place, copies may remain elsewhere. Anything sent to another person is out of your control. This is why thinking before sharing matters so much — you cannot always take things back. This is not a reason to never share anything, but a reason to be thoughtful.
Digital rights are one of the fastest-evolving areas of human rights. Understanding the main frameworks and current debates is essential for teaching at secondary level. The principle of online/offline equivalence: the UN Human Rights Council resolved in 2012 (and repeatedly since) that 'the same rights that people have offline must also be protected online'. This is the foundational principle of digital rights — existing human rights do not disappear online but must be applied to new contexts. The challenge is that applying them requires significant reinterpretation: what does 'freedom of assembly' mean in online spaces? What does 'privacy' mean when every device collects data? What does 'freedom of expression' mean when private platforms control public discourse?
Privacy is protected by Article 12 of the UDHR and Article 17 of the ICCPR. Modern digital privacy concerns fall into two categories. Government surveillance — where states monitor citizens' communications, movements, and activities — has expanded dramatically since the 'war on terror' began in 2001. Commercial surveillance — where companies collect personal data for profit — has become the business model of the internet. Both raise serious rights concerns. The Snowden revelations: Edward Snowden's 2013 disclosures revealed that the US National Security Agency (NSA) and its allies had built global mass surveillance systems that collected vast amounts of data on communications worldwide. The revelations showed that democratic governments had developed surveillance capabilities comparable to those imagined only in science fiction. Snowden's disclosures prompted significant legal reforms in some countries and are still shaping policy debates a decade later. The GDPR: the EU's General Data Protection Regulation (2018) is the most comprehensive data protection law in the world. It gives individuals rights over their data (to access, correct, delete), requires explicit consent for data processing, imposes large fines for violations, and has had significant global effects (many non-EU companies have adjusted to meet GDPR standards). Related laws have followed in California, Brazil, India, and elsewhere.
Major digital platforms (Meta/Facebook, Google, Twitter/X, TikTok, YouTube) now mediate significant portions of public discourse. Their content moderation decisions — what to leave up, what to remove, whom to deplatform — affect billions of people. The scale of their moderation is staggering: YouTube alone removes millions of videos per quarter. Content moderation raises genuinely difficult questions. Platforms remove content they judge harmful (hate speech, terrorism, child abuse, dangerous misinformation), but they also make many errors, are accused of both over-removal and under-removal, and face pressure from governments of all kinds. The EU's Digital Services Act (2022) requires major platforms to meet specific content standards and has created new regulatory mechanisms.
General Comment 25 (2021) of the UN Committee on the Rights of the Child applied the UNCRC to the digital environment.
It established that children's rights apply online; that services should be designed with children in mind; that data protection for children should be stronger; that education in digital literacy is essential; and that children should participate in decisions affecting their digital lives. Specific concerns include addiction (particularly with engagement-maximising design), exposure to harmful content, grooming, sexual exploitation, mental health impacts, and data exploitation.
Authoritarian regimes use digital tools to consolidate power. Mass surveillance networks (China's, Russia's, Iran's among others), facial recognition, social credit systems, internet controls (the Great Firewall of China), and internet shutdowns are increasingly widespread. India has the world's highest number of internet shutdowns (often linked to political events or protests). These tools threaten democratic freedoms wherever they are deployed. The digital divide: global internet access has risen but gaps remain. About 2.6 billion people still lack internet access, concentrated in developing countries, rural areas, and among women (the UN reports a persistent gender digital divide). Within countries, the divide separates those with fast, reliable, affordable access from those without — shaping access to education, work, government services, and political participation. The COVID-19 pandemic made clear that digital access is now essential to full social participation.
Governments increasingly respond to political events, protests, and elections by shutting down internet access entirely or restricting specific services. The #KeepItOn campaign and Access Now track these shutdowns — several hundred per year globally, with particular concentrations in India, Myanmar, Iran, Ethiopia, and several African countries. Shutdowns are associated with human rights violations, economic losses, and interference with education.
Artificial intelligence systems are reshaping what is at stake in digital rights. Facial recognition raises new surveillance concerns. Automated decision-making in welfare, hiring, lending, and criminal justice can produce discrimination at scale. Deepfakes threaten political discourse. Generative AI raises questions about authorship, misinformation, and the information environment. The EU AI Act (2024) represents the first major regulatory framework; debates continue globally.
This is a rapidly changing field where students may have more current knowledge of specific platforms than teachers. Focus on durable principles — privacy, free expression, safety, equality of access, accountability of power — that outlast specific apps.
Privacy is only important if you have something to hide.
Privacy protects everyone, not only wrongdoers. It enables the development of personal identity, intimate relationships, political opinions, religious beliefs, and creative expression without constant external observation. Without privacy, people self-censor, minority views are suppressed, and power concentrates in those who observe. Historical examples — from secret police in totalitarian states to the targeting of civil rights activists in democracies — show the real harms that follow loss of privacy. 'Nothing to hide' misunderstands what privacy is and why it matters.
Government surveillance is necessary for security and is therefore acceptable.
The trade-off between security and privacy is genuine but the claim that any security concern justifies any surveillance is not. Mass surveillance has been shown to be of limited effectiveness for preventing specific attacks (most attacks are carried out by individuals already known to authorities). Targeted surveillance with judicial oversight can be effective; bulk surveillance of entire populations is disproportionate. International human rights law — as interpreted by the UN Human Rights Committee and major courts — requires that surveillance be necessary, proportionate, authorised by law, and subject to meaningful oversight. These standards are often violated.
Platform content moderation is censorship.
Platform moderation is not censorship in the traditional legal sense — private platforms are not governments and have different legal positions. But the scale and influence of major platforms raise genuine questions about power and expression. The framing as either 'private choice' or 'censorship' is too simple. Platform decisions affect billions and deserve serious scrutiny — but this is different from state censorship, which raises different issues. The appropriate response involves transparency, due process, and regulation that preserves pluralism without replacing private judgement with government judgement.
Children are 'digital natives' and therefore naturally competent and safe online.
Familiarity with digital technology is not the same as digital literacy or online safety. Research shows that young people are often no better than older people at identifying disinformation, recognising scams, or protecting their privacy. They may be more vulnerable to specific platform-based risks — TikTok algorithms, social media pressure, online harassment. The 'digital native' framing can obscure genuine risks and responsibilities. Young people need explicit education on digital literacy, privacy, and safety — their familiarity with devices does not provide this automatically.
Key texts accessible to students: Shoshana Zuboff, 'The Age of Surveillance Capitalism' (2019) — foundational critique of commercial data collection; long but the key ideas are clear. Glenn Greenwald, 'No Place to Hide' (2014) — the insider account of the Snowden disclosures. Carole Cadwalladr's Cambridge Analytica investigation (Guardian, 2018) — on political data exploitation. For theory: Helen Nissenbaum, 'Privacy in Context' (2010); Daniel Solove, 'Understanding Privacy' (2008). For children and digital rights: UN Committee on the Rights of the Child General Comment 25 (2021). For current research: the Oxford Internet Institute (oii.ox.ac.uk), the Electronic Frontier Foundation (eff.org), Access Now (accessnow.org), Article 19 (article19.org), and Privacy International (privacyinternational.org) all publish accessible work. The Berkman Klein Center at Harvard (cyber.harvard.edu) is a major academic resource.