What propaganda and misinformation are, how they work, how they have been used in history and today, and how to defend ourselves against them.
Young children can begin to understand the basic idea behind media literacy through everyday experience. The core instincts to build are: not everything you hear is true; it is okay to pause and check; some stories are told to try to make us feel strong feelings or do something. Children do not need the words 'propaganda' or 'misinformation'. But they can begin to notice when something does not feel right, to ask questions, and to feel safe saying 'I don't know if that's true'. In a world where children are exposed to online information from very young ages, these skills are essential. Build them through examples from stories, playground rumours, and everyday conversations. The goal is not to make children mistrustful — it is to help them pause and think. No materials are needed.
If a grown-up said it, it must be true.
Most grown-ups try hard to tell the truth. But even grown-ups can be wrong, or can believe something that turns out not to be correct. It is always okay to ask 'how do you know?' or to check with someone else. This is not rude — it is how we learn.
If lots of people are saying the same thing, it must be true.
Sometimes many people believe something that is not true — especially if a story is exciting or scary. The number of people who believe something is not proof. What matters is whether there is good evidence. Asking 'how do we know?' and 'where did this come from?' is always a good idea.
Propaganda is communication designed to persuade people towards a particular view or action — usually in the service of a specific political or commercial goal. It is not simply information; it uses emotional appeals, selective facts, powerful images, and simple messages to shape beliefs. Propaganda has existed throughout history but became systematic in the 20th century, especially under the Nazi and Soviet regimes, both of which built vast propaganda machines to support their rule. Modern authoritarian regimes continue the tradition.
Misinformation is false information shared without knowing it is false — someone passes on a wrong claim because they believe it. Disinformation is false information deliberately created and spread to deceive. Both are serious problems; disinformation is more troubling because it is intentional. In the digital age, propaganda and disinformation have been transformed. Social media platforms allow false information to spread fast, cheaply, and to specific groups. Russian interference in elections (exposed in detail after 2016); coordinated disinformation campaigns about COVID-19 vaccines; propaganda networks supporting conspiracy theories; and the weaponisation of genuine disagreements — all show how the digital environment has enabled new forms of manipulation.
AI has added new tools: deepfake videos, AI-generated articles, and bot networks that mimic human users. The techniques of propaganda, however, are often predictable. They include: simple slogans that stick in memory; dividing people into 'us' and 'them'; using strong emotions (fear, anger, pride); creating enemies; repeating the same message many times; mixing truth with lies to make lies seem credible; attacking the messengers (journalists, scientists) rather than responding to their arguments; and making complex issues seem simple.
The main defences are media literacy (being able to evaluate sources), diverse information sources, critical thinking, fact-checking, and a culture that values accuracy. Media literacy is now a key life skill — arguably as important as reading and writing.
This topic is timely and important. Be careful to present examples from many political directions — propaganda and disinformation come from right, left, and centre, from governments and private actors, from domestic and foreign sources. Do not let the topic become a way of dismissing views you disagree with; help students think carefully about all sources, including ones that confirm their existing beliefs.
Only foolish or uneducated people fall for propaganda and misinformation.
Anyone can be fooled. Research shows that people of all ages, education levels, and political views can be taken in by well-designed disinformation. In fact, some well-educated people are particularly vulnerable because they are confident in their ability to tell truth from lies. The smart response is not to assume you cannot be fooled, but to develop habits (checking sources, pausing before sharing, seeking out different perspectives) that protect you.
Fact-checking means believing only 'official' sources.
Fact-checking is not about trusting the government or any single source. It is about looking at evidence — original documents, named sources, reliable data — and comparing different sources to see if they agree. Official sources can also be wrong or misleading. The point is to think carefully about evidence, not to accept one authority over another.
If something is funny or interesting, it does not matter if it is true.
Things that spread widely — funny memes, dramatic stories, shocking claims — often matter most because they reach the most people. A funny lie that millions of people share can shape what those people believe about a serious issue. In the age of social media, there is no clear line between 'entertainment' and 'information'. The habit of checking applies to everything you might share.
Propaganda and disinformation are among the most urgent civic issues of the 21st century. Understanding their forms and history is essential for secondary teaching.

The history of propaganda: propaganda in a loose sense is as old as politics, but its systematic modern form begins in the early 20th century. The Committee on Public Information (US, WWI) pioneered large-scale democratic propaganda. The Nazi Propaganda Ministry under Joseph Goebbels built the most sophisticated totalitarian propaganda machine of the 20th century, with control of radio, film, press, and public culture. The Soviet Union maintained a comparable system with different aesthetics. Post-war, propaganda techniques were transferred into commercial advertising and public relations, largely by figures like Edward Bernays (Freud's nephew), who wrote 'Propaganda' (1928) and 'Public Relations' (1952) and invented many modern persuasion techniques.

Classical propaganda techniques were catalogued by the Institute for Propaganda Analysis (1937-1942) and include: name-calling (attaching negative labels), glittering generalities (vague appealing phrases), transfer (associating a cause with respected symbols), testimonial (endorsement by respected figures), plain folks (claiming to be ordinary), card-stacking (presenting only one side), bandwagon (everyone is doing it), and appeals to fear, anger, or belonging. These techniques have not fundamentally changed; the channels have.
Three things distinguish disinformation in the digital age. First, scale — messages can reach hundreds of millions of people without traditional media gatekeepers. Second, targeting — platforms allow disinformation to be directed precisely at specific demographic groups. Third, amplification — algorithms reward engagement, and engagement is highest for emotionally charged content, which disinformation typically exploits.
Prominent cases include: Russian interference in the 2016 US election (documented in the Mueller Report and Senate Intelligence Committee reports); the Cambridge Analytica scandal; coordinated disinformation around COVID-19; the role of WhatsApp and Facebook in spreading lynching rumours in India; and the role of social media in inciting ethnic violence in Myanmar.
State-backed information operations have become a recognised tool of geopolitics. Russia's Internet Research Agency has conducted systematic operations targeting multiple democracies. China's influence operations have intensified, particularly around Hong Kong, Taiwan, and Xinjiang narratives. Iran, North Korea, and others conduct similar operations. The US, UK, and other democracies also conduct information operations abroad. The line between legitimate public diplomacy and illegitimate interference is contested.
Generative AI has made it possible to produce convincing fake text, images, audio, and video at scale. Deepfakes — AI-generated videos of real people saying things they never said — are already in circulation. AI-generated articles can flood the internet with plausible-sounding but entirely fabricated content. Bot networks using AI can mimic human users more convincingly than ever. The emerging concern is that the information environment itself may become impossible to trust.
Media literacy education has been rolled out in many democracies, with some evidence of effectiveness. Fact-checking organisations (PolitiFact, FactCheck.org, Full Fact, AFP Fact Check, and many others) verify major claims. Platform regulation (the EU's Digital Services Act) attempts to force platforms to address disinformation while preserving speech. Provenance standards (watermarking AI content, content authenticity initiatives) are being developed. None of these is a complete solution.

The limits of fact-checking: research suggests fact-checking's effectiveness is limited by 'motivated reasoning' — people evaluate information based on whether it fits their existing beliefs rather than purely on evidence. This makes simple 'here is the truth' corrections insufficient. More effective approaches may include 'prebunking' (showing people how manipulation works before they encounter it), improving general critical thinking, and rebuilding trust in institutions.
Disinformation is politically loaded. Be careful to use examples from all political directions and from both authoritarian and democratic sources. The goal is to develop students' critical thinking, not to endorse any particular political position.
Propaganda is something only authoritarian governments do.
Propaganda techniques are used by many actors — governments of all kinds, political parties, businesses, and advocacy groups. The Nazi and Soviet regimes built the most extensive propaganda machines in history, but democratic governments have used propaganda too (especially during wartime), and commercial advertising uses many of the same techniques. The distinction between 'propaganda' (bad, done by them) and 'communication' (good, done by us) is often a rhetorical move rather than an analytical one. Recognising propaganda techniques in all their forms — including those used by people we agree with — is the beginning of genuine media literacy.
Fact-checking can solve the disinformation problem.
Research consistently shows that fact-checking has limited effectiveness, especially when the original claim fits what people already want to believe. Motivated reasoning means people process information through the lens of their existing beliefs and identities, not purely through evidence. Fact-checking corrects the factual record but often does not change minds. More effective approaches include prebunking (showing people how manipulation works before they encounter it), rebuilding trust in institutions, and addressing the underlying conditions (economic anxiety, social alienation) that make disinformation attractive. Fact-checking matters but cannot do the job alone.
Young 'digital native' people are better at spotting disinformation than older people.
The evidence is more complicated. Younger people are more confident in their media literacy, but studies often find they are no better — and sometimes worse — at actually identifying false information than older people. Older people are more likely to share disinformation (perhaps because they share more generally on the platforms they use). Younger people may be more susceptible to certain platform-specific forms (TikTok-based conspiracy theories). Being 'digital native' is not the same as being media-literate; media literacy requires active learning and practice, not just familiarity with technology.
Regulating disinformation is the same as censorship.
The relationship between regulating disinformation and censorship is genuinely difficult, but they are not the same thing. Censorship typically means the state prohibiting expression of particular ideas. Regulating disinformation might include: transparency requirements (platforms must disclose advertising sources); anti-manipulation rules (against coordinated inauthentic behaviour); provenance standards (labelling AI-generated content); and strict rules against specific harms like election fraud or health misinformation that creates imminent danger. Reasonable people disagree about where these become censorship, but the claim that any regulation is censorship shuts off discussion of legitimate responses to genuine harms.
Key texts accessible to students: Edward Bernays, 'Propaganda' (1928) — a remarkably frank account of modern persuasion from one of its pioneers. Hannah Arendt, 'The Origins of Totalitarianism' (1951) — essential for understanding totalitarian propaganda. Jacques Ellul, 'Propaganda: The Formation of Men's Attitudes' (1965) — the most systematic theoretical account. For modern disinformation: Peter Pomerantsev, 'This Is Not Propaganda' (2019) — vivid and accessible. Yochai Benkler, Robert Faris, and Hal Roberts, 'Network Propaganda' (2018) — the best empirical study of the American disinformation ecosystem. Renée DiResta's writings on platforms and disinformation are excellent and accessible. For the AI challenge: Nina Schick, 'Deepfakes' (2020). For practical media literacy: the work of Mike Caulfield, particularly his SIFT method (Stop, Investigate the source, Find better coverage, Trace claims).

Resources: the International Fact-Checking Network (ifcncodeofprinciples.poynter.org) lists accredited fact-checkers worldwide. The Shorenstein Center at Harvard publishes extensively on disinformation. First Draft News (firstdraftnews.org) and the Berkman Klein Center have useful resources for students.