
Propaganda and Misinformation

What propaganda and misinformation are, how they work, how they have been used in history and today, and how to defend ourselves against them.

Core Ideas
1 Not everything people say is true
2 It is good to check what you hear
3 Some stories are made to make us feel strong feelings
4 It is okay to say 'I do not know yet'
5 Asking a grown-up you trust can help
Background for Teachers

Young children can begin to understand the basic idea behind media literacy through everyday experience. The core instincts to build are: not everything you hear is true; it is okay to pause and check; some stories are told to try to make us feel strong feelings or do something. Children do not need the words 'propaganda' or 'misinformation'. But they can begin to notice when something does not feel right, to ask questions, and to feel safe saying 'I don't know if that's true'. In a world where children are exposed to online information from very young ages, these skills are essential. Build them through examples from stories, playground rumours, and everyday conversations. The goal is not to make children mistrustful — it is to help them pause and think. No materials are needed.

Classroom Activities
Activity 1 — Is it really true?
Purpose: Children practise noticing that not everything they hear is true.
How to run it: Play a simple game. Make some statements — some true, some made up. For each, ask: is this true? Do we know for sure? For example: 'It is raining outside right now.' (Check!) 'All cats have six legs.' (We know it's wrong.) 'There is a dinosaur in the playground.' (Let's think — is that likely?) 'Your friend said she has a pet tiger.' (Hmm — should we ask more questions?) Discuss: some things we can check. Some things sound wrong straight away. Some things are hard to tell. It is always okay to ask, 'How do you know?' Ask: what do you do when you are not sure if something is true?
💡 Low-resource tip: Discussion only. No materials needed.
Activity 2 — Strong feelings and careful thinking
Purpose: Children notice that some stories are designed to make us feel strongly — and that strong feelings can make us less careful.
How to run it: Tell two versions of the same story. Version 1 (calm): 'A dog ran across the road. A car had to slow down. Everyone is fine.' Version 2 (dramatic): 'A terrible dog ran into the road! A car nearly crashed! Everyone was in danger!' Ask: which version made you feel more? Which version is more honest? Discuss: sometimes people tell stories in a way that makes us scared or angry, to make us pay attention. This can be okay — dramatic stories are fun. But if we are making a real decision, we should look past the feelings and ask what really happened. Strong feelings can make it harder to think carefully.
💡 Low-resource tip: Tell the stories verbally. No materials needed.
Activity 3 — Asking a grown-up you trust
Purpose: Children learn that when they are unsure, they can ask a trusted adult.
How to run it: Ask: if you hear something and you are not sure if it is true, who could you ask? Collect answers: a parent, a teacher, an older sister or brother, a family friend. Explain: when grown-ups do not know something, they also ask other people they trust. This is normal. You are not silly for not knowing. Finding out is how we learn. Practice: say a few fake statements and have children practise saying, 'I'm not sure — I'll ask someone.' For example: 'Someone told me ice cream makes you grow taller.' 'I heard the moon is made of cheese.' 'My friend said lions live in cold places.' Discuss: there is no shame in saying 'I don't know yet — let me find out.'
💡 Low-resource tip: Discussion only. No materials needed.
Discussion Questions
  • Q1: Has someone ever told you something that turned out not to be true? What happened?
  • Q2: When you are not sure if something is true, what can you do?
  • Q3: Can a story be fun without being exactly true? When is that okay, and when is it not?
  • Q4: Who are the grown-ups you trust to help you check things?
  • Q5: Why might someone make up a story or tell it in a very dramatic way?
Writing Tasks
Drawing task
Draw a picture of a child checking if something is true. Write or say: The child heard ___________. To check, they ___________.
Skills: Understanding the act of checking information
Sentence completion
If I hear something and I am not sure, I can ___________. Not everything people say is ___________.
Skills: Articulating strategies for handling uncertain information
Common Misconceptions
Common misconception

If a grown-up said it, it must be true.

What to teach instead

Most grown-ups try hard to tell the truth. But even grown-ups can be wrong, or can believe something that turns out not to be correct. It is always okay to ask 'how do you know?' or to check with someone else. This is not rude — it is how we learn.

Common misconception

If lots of people are saying the same thing, it must be true.

What to teach instead

Sometimes many people believe something that is not true — especially if a story is exciting or scary. The number of people who believe something is not proof. What matters is whether there is good evidence. Asking 'how do we know?' and 'where did this come from?' is always a good idea.

Core Ideas
1 What propaganda is
2 Misinformation and disinformation — the difference
3 How false information spreads
4 Propaganda in history
5 Modern propaganda in the digital age
6 How to check what you read
Background for Teachers

Propaganda is communication designed to persuade people towards a particular view or action — usually in the service of a specific political or commercial goal. It is not simply information; it uses emotional appeals, selective facts, powerful images, and simple messages to shape beliefs. Propaganda has existed throughout history but became systematic in the 20th century, especially under the Nazi and Soviet regimes, both of which built vast propaganda machines to support their rule. Modern authoritarian regimes continue the tradition.

Related but distinct terms

Misinformation is false information shared without knowing it is false — someone passes on a wrong claim because they believe it. Disinformation is false information deliberately created and spread to deceive. Both are serious problems; disinformation is more troubling because it is intentional. In the digital age, propaganda and disinformation have been transformed. Social media platforms allow false information to spread fast, cheaply, and to specific groups. Russian interference in elections (exposed in detail after 2016); coordinated disinformation campaigns about COVID-19 vaccines; propaganda networks supporting conspiracy theories; and the weaponisation of genuine disagreements — all show how the digital environment has enabled new forms of manipulation.

Artificial intelligence is making this worse

These tools include deepfake videos, AI-generated articles, and bot networks that mimic human users. The techniques of propaganda are often predictable. They include: simple slogans that stick in memory; dividing people into 'us' and 'them'; using strong emotions (fear, anger, pride); creating enemies; repeating the same message many times; mixing truth with lies to make lies seem credible; attacking the messengers (journalists, scientists) rather than responding to their arguments; and making complex issues seem simple.

Defences include

Media literacy (being able to evaluate sources); diverse information sources; critical thinking; fact-checking; and a culture that values accuracy. Media literacy is now a key life skill — arguably as important as reading and writing.

Teaching note

This topic is timely and important. Be careful to present examples from many political directions — propaganda and disinformation come from right, left, and centre, from governments and private actors, from domestic and foreign sources. Do not let the topic become a way of dismissing views you disagree with; help students think carefully about all sources, including ones that confirm their existing beliefs.

Key Vocabulary
Propaganda
Information designed to persuade people towards a particular view or action — often using strong emotions, simple messages, and selective facts.
Misinformation
False or wrong information — shared by people who do not know it is wrong.
Disinformation
False information that is created and shared on purpose, to deceive people.
Fact-checking
Checking whether a claim is true — by looking at evidence, original sources, and reliable data.
Conspiracy theory
A belief that important events are being secretly controlled by a hidden group — usually without good evidence.
Deepfake
A fake video or audio clip — made using computer technology — that makes someone appear to say or do something they did not really do.
Bot
A computer program that pretends to be a real person online — often used to spread messages quickly and make a view seem more popular than it is.
Media literacy
The ability to understand, evaluate, and think carefully about the information we see in news, online, and elsewhere.
Classroom Activities
Activity 1 — Propaganda in history
Purpose: Students see how propaganda has been used by powerful governments to shape public opinion.
How to run it: Explain that propaganda has a long history, but became most powerful in the 20th century. Present brief examples. (1) Nazi Germany: the Nazis built a huge propaganda system under Joseph Goebbels. Posters, films, and radio broadcasts promoted Hitler, presented Germans as superior, and dehumanised Jews. This propaganda helped make the Holocaust possible. (2) The Soviet Union: Soviet propaganda glorified Stalin, hid the real conditions of life, and promoted communism as the future. (3) Wartime propaganda: almost all countries in major wars have used propaganda — including democracies. Posters urging people to support the war effort, films demonising the enemy, and controlled news during conflict. (4) More recent: the Rwandan radio station RTLM, closely tied to the government, played a central role in inciting the 1994 genocide against the Tutsi. Ask: what do these examples have in common? Usually: simple messages, emotional appeals, an 'enemy' group, control of the media, and repetition. Discuss: why are powerful governments so interested in controlling what people believe? Because beliefs shape action. If people believe what the government wants, they will do what it wants — or at least not oppose it.
💡 Low-resource tip: Teacher presents examples verbally. Students discuss. No materials needed.
Activity 2 — The techniques of propaganda
Purpose: Students learn to recognise common propaganda techniques and apply them to examples.
How to run it: Present seven common techniques. (1) Simple slogan: one short message, easy to remember. (2) Us and them: dividing the world into good people (us) and bad people (them). (3) Appeal to emotion: using fear, anger, or pride to make you feel rather than think. (4) Repeated message: saying the same thing over and over until it feels true. (5) Attack the messenger: instead of answering a criticism, attack the person making it. (6) Mix truth and lies: making lies easier to believe by mixing them with true things. (7) Making complicated things seem simple: pretending there is one cause and one solution when the real world is complicated. Then present some short fictional examples of messages. For each, ask students which techniques are being used. For example: 'Our great nation is under attack by foreigners! Only ONE LEADER can save us! The journalists who criticise him are traitors!' — slogans, us/them, fear, attacking messengers, simplification. Discuss: once you know the techniques, you can spot them. This is a life skill.
💡 Low-resource tip: Teacher lists techniques on the board and reads examples. No materials needed.
Activity 3 — How to check a story
Purpose: Students learn practical tools for evaluating whether a story or claim is trustworthy.
How to run it: Present a simple checklist to use when you see a surprising or dramatic claim. (1) Who is saying this? A known journalist? An anonymous account? A government? A business with something to gain? (2) Where is the evidence? Are there named sources, documents, or data? Or just vague claims? (3) Do other reliable sources report the same thing? A dramatic story that only one source has should make you pause. (4) Is this designed to make me feel strong emotions? If yes, slow down — emotion can make it harder to think clearly. (5) Does the source admit mistakes and correct them? (6) When was this first published? Old stories sometimes get shared as if they are new. Practice with short fictional examples and have students ask each question. Discuss: you do not have to check every single thing you see — that would be impossible. But for big, dramatic, or surprising claims, a few seconds of checking can save you from being fooled.
💡 Low-resource tip: Teacher writes the checklist on the board. No printed materials needed.
Discussion Questions
  • Q1: Have you ever seen something online that turned out not to be true? How did you find out?
  • Q2: Why do you think some people deliberately spread false information?
  • Q3: Can you think of a time when a story made you feel very strong emotions? Did you stop to check it?
  • Q4: Is it always possible to tell what is true and what is not? What do you do when you can't?
  • Q5: Whose job is it to stop propaganda and misinformation — governments, platforms, schools, or individuals?
  • Q6: Is there a difference between a joke, a mistake, and a deliberate lie? How do we tell them apart?
Writing Tasks
Task 1 — Explain and give an example
Explain the difference between misinformation and disinformation and give ONE example of how either can cause harm. Write 3 to 5 sentences.
Skills: Explanation writing, distinguishing two concepts, giving examples
Task 2 — Short argument
Explain why learning to check information carefully is an important life skill today. Write 4 to 6 sentences.
Skills: Reasoning, understanding contemporary media environment, connecting individual skill to wider outcomes
Common Misconceptions
Common misconception

Only foolish or uneducated people fall for propaganda and misinformation.

What to teach instead

Anyone can be fooled. Research shows that people of all ages, education levels, and political views can be taken in by well-designed disinformation. In fact, well-educated people can be particularly vulnerable, because confidence in their ability to tell truth from lies makes them less likely to check. The smart response is not to assume you cannot be fooled, but to develop habits (checking sources, pausing before sharing, seeking out different perspectives) that protect you.

Common misconception

Fact-checking means believing only 'official' sources.

What to teach instead

Fact-checking is not about trusting the government or any single source. It is about looking at evidence — original documents, named sources, reliable data — and comparing different sources to see if they agree. Official sources can also be wrong or misleading. The point is to think carefully about evidence, not to accept one authority over another.

Common misconception

If something is funny or interesting, it does not matter if it is true.

What to teach instead

Things that spread widely — funny memes, dramatic stories, shocking claims — often matter most because they reach the most people. A funny lie that millions of people share can shape what those people believe about a serious issue. In the age of social media, there is no clear line between 'entertainment' and 'information'. The habit of checking applies to everything you might share.

Core Ideas
1 The history of propaganda
2 Bernays and the emergence of modern public relations
3 Propaganda techniques — classical and new
4 Disinformation in the digital age
5 Foreign interference in democracies
6 AI and the new wave
7 Defending democracy — media literacy and institutions
8 The limits of fact-checking
Background for Teachers

Propaganda and disinformation are among the most urgent civic issues of the 21st century. Understanding their forms and history is essential for secondary teaching.

The history of propaganda: propaganda in a loose sense is as old as politics, but its systematic modern form begins in the early 20th century. The Committee on Public Information (US, WWI) pioneered large-scale democratic propaganda. The Nazi Propaganda Ministry under Joseph Goebbels built the most sophisticated totalitarian propaganda machine of the 20th century, with control of radio, film, press, and public culture. The Soviet Union maintained a comparable system with different aesthetics. Post-war, propaganda techniques were transferred into commercial advertising and public relations, largely by figures like Edward Bernays (Freud's nephew), who wrote 'Propaganda' (1928) and 'Public Relations' (1952) and invented many modern persuasion techniques.

Classical propaganda techniques were catalogued by the Institute for Propaganda Analysis (1937-1942) and include: name-calling (attaching negative labels), glittering generalities (vague appealing phrases), transfer (associating a cause with respected symbols), testimonial (endorsement by respected figures), plain folks (claiming to be ordinary), card-stacking (presenting only one side), bandwagon (everyone is doing it), and appeals to fear, anger, or belonging. These techniques have not fundamentally changed; the channels have.

Digital disinformation

Three things distinguish disinformation in the digital age. First, scale — messages can reach hundreds of millions of people without traditional media gatekeepers. Second, targeting — platforms allow disinformation to be directed precisely at specific demographic groups. Third, amplification — algorithms reward engagement, and engagement is highest for emotionally charged content, which disinformation typically exploits.

Major cases include

Russian interference in the 2016 US election (documented in the Mueller Report and Senate Intelligence Committee reports); the Cambridge Analytica scandal; coordinated disinformation around COVID-19; the role of WhatsApp and Facebook in spreading lynching rumours in India; and the role of social media in inciting ethnic violence in Myanmar.

Foreign interference

State-backed information operations have become a recognised tool of geopolitics. Russia's Internet Research Agency has conducted systematic operations targeting multiple democracies. China's influence operations have intensified, particularly around Hong Kong, Taiwan, and Xinjiang narratives. Iran, North Korea, and others conduct similar operations. The US, UK, and other democracies also conduct information operations abroad. The line between legitimate public diplomacy and illegitimate interference is contested.

AI and the new wave

Generative AI has made it possible to produce convincing fake text, images, audio, and video at scale. Deepfakes — AI-generated videos of real people saying things they never said — are already in circulation. AI-generated articles can flood the internet with plausible-sounding but entirely fabricated content. Bot networks using AI can mimic human users more convincingly than ever. The emerging concern is that the information environment itself may become impossible to trust.

Defensive measures

Media literacy education has been rolled out in many democracies, with some evidence of effectiveness. Fact-checking organisations (PolitiFact, FactCheck.org, Full Fact, AFP Fact Check, and many others) verify major claims. Platform regulation (the EU's Digital Services Act) attempts to force platforms to address disinformation while preserving speech. Provenance standards (watermarking AI content, content authenticity initiatives) are being developed. None of these is a complete solution.

The limits of fact-checking: research suggests fact-checking's effectiveness is limited by 'motivated reasoning' — people evaluate information based on whether it fits their existing beliefs rather than purely on evidence. This makes simple 'here is the truth' corrections insufficient. More effective approaches may include 'prebunking' (showing people how manipulation works before they encounter it), improving general critical thinking, and rebuilding trust in institutions.

Teaching note

Disinformation is politically loaded. Be careful to use examples from all political directions and from both authoritarian and democratic sources. The goal is to develop students' critical thinking, not to endorse any particular political position.

Key Vocabulary
Propaganda
Communication systematically designed to influence the attitudes or behaviour of an audience, typically through emotional appeals, selective presentation of facts, and simple messages, in service of a specific cause or interest.
Disinformation
False information created and spread deliberately to deceive — distinguished from misinformation (which is spread unintentionally) by the intent of those who create it.
Malinformation
Genuine information — often private — that is shared out of context or with malicious intent to cause harm. Distinguished from disinformation, which involves falsehood.
Motivated reasoning
The psychological tendency to evaluate information based on whether it fits one's existing beliefs, identity, or preferences, rather than on evidence alone. A major obstacle to traditional fact-checking.
Filter bubble
The situation in which a person is exposed primarily to information that confirms their existing views — typically through algorithmic curation on digital platforms.
Echo chamber
A social environment in which people encounter only beliefs or opinions that match their own — related to filter bubbles but emphasising social, not algorithmic, selection.
Astroturfing
The practice of creating fake grassroots movements — pretending that a campaign is spontaneous popular support when it is actually coordinated by a particular interest.
Deepfake
Synthetic media — usually video or audio — generated by AI to convincingly depict someone saying or doing something they did not. An emerging tool of disinformation.
Prebunking
A preventative approach to disinformation in which people are shown how manipulation techniques work before they encounter actual disinformation — shown to produce more durable effects than after-the-fact fact-checking.
Information operation
A coordinated campaign — typically state-sponsored — to influence the information environment in a target country, through disinformation, amplification of genuine disagreements, and other methods.
Classroom Activities
Activity 1 — Analysing real propaganda
Purpose: Students apply analytical techniques to real examples of propaganda from different sources.
How to run it: Describe several examples of real propaganda without naming the source at first. For each, ask students to identify the techniques used. Then reveal the source. (1) 'Our country is surrounded by enemies! Only a strong leader can defend us! The media that criticises us are traitors to the nation!' (This general pattern has been used by many authoritarian regimes — students can match it to examples they know.) (2) A Soviet-era poster showing a heroic worker in stylised poses with slogans about the revolution. (3) A Western wartime poster dehumanising an enemy nation. (4) A modern political advertisement promising that an opponent will destroy the country. (5) A commercial advertisement playing on status anxiety or body image. After each, discuss: which techniques are used? How do they try to make you feel rather than think? Is the difference between 'propaganda' and 'advertising' or 'political campaigning' clear, or blurry? The exercise is to show that propaganda techniques are not only used by dictators — they are embedded in much modern communication. Learning to spot them is a defence wherever they appear.
💡 Low-resource tip: Teacher describes posters and messages verbally. No visual materials needed, though printed or projected images help if available.
Activity 2 — How disinformation spreads
Purpose: Students understand the mechanics of modern disinformation and how it exploits digital platforms.
How to run it: Walk through the typical pattern of a modern disinformation campaign. (1) Content creation: a false story is created — sometimes by a state-sponsored group, sometimes by a commercial 'content farm' seeking ad revenue, sometimes by ideologically motivated individuals. (2) Seeding: the content is posted on multiple accounts across platforms. (3) Amplification: bots and coordinated accounts share the content, making it appear popular. Algorithms then amplify it further because of high engagement. (4) Legitimisation: the content reaches genuine users, who may share it. It may be picked up by small media outlets, then larger ones. (5) Persistence: even when debunked, the original claim often spreads further than any correction ('lies travel faster than the truth'). Present a case study. The 'Pizzagate' conspiracy theory in the US (2016): a completely false claim about a pizza restaurant, amplified across social media, led to an armed man entering the restaurant. Or: COVID-19 vaccine disinformation, which was deliberately amplified by multiple state and non-state actors. Ask: at what points in this chain could disinformation be stopped? What role do platforms play? What role do individual users play? What role do governments play? What are the limits of each kind of intervention?
💡 Low-resource tip: Teacher describes the process verbally. Students discuss in groups. No materials needed.
Activity 3 — AI, deepfakes, and the future of truth
Purpose: Students engage with emerging challenges to the information environment from generative AI.
How to run it: Present the emerging problem. Generative AI can now produce convincing fake text, images, audio, and video at low cost and enormous scale. This has several implications. (1) Fake content: we can no longer assume that a video showing someone saying something is real. Deepfakes of politicians, celebrities, and ordinary people already exist. (2) Scale: AI can generate millions of fake articles, comments, and posts, flooding the information environment. (3) Tailoring: AI can generate disinformation tailored precisely to specific audiences, exploiting their particular beliefs and biases. (4) The 'liar's dividend': once deepfakes are common, genuine evidence of wrongdoing can be dismissed as 'fake' — even when it is real. This creates advantages for wrongdoers. (5) Scepticism collapse: if we can trust nothing, we may end up placing our trust in tradition, in charismatic leaders, or in nothing at all. None of these is good for democracy. Ask students: what defences exist? Content provenance (digital 'watermarks' on AI content); trusted institutions (major news organisations, scientific consensus); individual scepticism habits; legal regulation of AI; platform moderation. Are any of these sufficient? Is the problem technical, legal, social, or all three? Discuss: is there a real risk of an information environment in which truth itself becomes impossible to establish? What would a society look like in which this happened?
💡 Low-resource tip: Teacher presents the issues verbally. Students discuss in groups. No materials needed.
Discussion Questions
  • Q1: Fact-checking has grown massively over the past decade but disinformation has arguably grown faster. Are fact-checkers fighting a battle they cannot win? If so, what else is needed?
  • Q2: Research on motivated reasoning suggests people rarely change their minds when confronted with evidence against their beliefs. What does this mean for how to address disinformation?
  • Q3: Should governments regulate disinformation — and if so, how can this be done without itself becoming censorship?
  • Q4: Large platforms have enormous power over what millions of people see. Should their moderation decisions be subject to democratic oversight, and how could this work?
  • Q5: The line between propaganda and legitimate political communication is not always clear. A political party's advertising uses many of the same techniques. Where is the line?
  • Q6: Authoritarian states invest heavily in information operations abroad. Democracies typically do not — or claim not to. Is this asymmetry sustainable? Should democracies respond in kind?
  • Q7: Young people are often described as more media-literate than older people — but also as more vulnerable to TikTok-era disinformation. Which is right, and what does the evidence show?
Writing Tasks
Task 1 — Extended essay
'The greatest threat to democracy today is not authoritarian governments but the collapse of a shared information environment.' To what extent do you agree? Write 400 to 600 words.
Skills: Thesis-driven argument, engaging with competing threats, connecting disinformation to democratic theory
Task 2 — Analytical response
Explain the difference between misinformation, disinformation, and malinformation, and give an example of each. Write 200 to 300 words.
Skills: Explaining related concepts, distinguishing them, giving concrete examples
Common Misconceptions
Common misconception

Propaganda is something only authoritarian governments do.

What to teach instead

Propaganda techniques are used by many actors — governments of all kinds, political parties, businesses, and advocacy groups. The Nazi and Soviet regimes built the most extensive propaganda machines in history, but democratic governments have used propaganda too (especially during wartime), and commercial advertising uses many of the same techniques. The distinction between 'propaganda' (bad, done by them) and 'communication' (good, done by us) is often a rhetorical move rather than an analytical one. Recognising propaganda techniques in all their forms — including those used by people we agree with — is the beginning of genuine media literacy.

Common misconception

Fact-checking can solve the disinformation problem.

What to teach instead

Research consistently shows that fact-checking has limited effectiveness, especially when the original claim fits what people already want to believe. Motivated reasoning means people process information through the lens of their existing beliefs and identities, not purely through evidence. Fact-checking corrects the factual record but often does not change minds. More effective approaches include prebunking (showing people how manipulation works before they encounter it), rebuilding trust in institutions, and addressing the underlying conditions (economic anxiety, social alienation) that make disinformation attractive. Fact-checking matters but cannot do the job alone.

Common misconception

Young 'digital native' people are better at spotting disinformation than older people.

What to teach instead

The evidence is more complicated. Younger people are more confident in their media literacy, but studies often find they are no better — and sometimes worse — at actually identifying false information than older people. Older people are more likely to share disinformation (perhaps because they share more generally on the platforms they use). Younger people may be more susceptible to certain platform-specific forms (TikTok-based conspiracy theories). Being 'digital native' is not the same as being media-literate; media literacy requires active learning and practice, not just familiarity with technology.

Common misconception

Regulating disinformation is the same as censorship.

What to teach instead

The relationship between regulating disinformation and censorship is genuinely difficult, but they are not the same thing. Censorship typically means the state prohibiting expression of particular ideas. Regulating disinformation might include: transparency requirements (platforms must disclose advertising sources); anti-manipulation rules (against coordinated inauthentic behaviour); provenance standards (labelling AI-generated content); and strict rules against specific harms like election fraud or health misinformation that creates imminent danger. Reasonable people disagree about where these become censorship, but the claim that any regulation is censorship shuts off discussion of legitimate responses to genuine harms.

Further Information

Key texts accessible to students:
  • Edward Bernays, 'Propaganda' (1928) — a remarkably frank account of modern persuasion from one of its pioneers.
  • Hannah Arendt, 'The Origins of Totalitarianism' (1951) — essential for understanding totalitarian propaganda.
  • Jacques Ellul, 'Propaganda: The Formation of Men's Attitudes' (1965) — the most systematic theoretical account.
  • For modern disinformation: Peter Pomerantsev, 'This Is Not Propaganda' (2019) — vivid and accessible; and Yochai Benkler, Robert Faris, and Hal Roberts, 'Network Propaganda' (2018) — the best empirical study of the American disinformation ecosystem. Renée DiResta's writings on platforms and disinformation are excellent and accessible.
  • For the AI challenge: Nina Schick, 'Deepfakes' (2020).
  • For practical media literacy: the work of Mike Caulfield, particularly his SIFT method (Stop, Investigate the source, Find better coverage, Trace claims).

Resources: the International Fact-Checking Network (ifcncodeofprinciples.poynter.org) lists accredited fact-checkers worldwide. The Shorenstein Center at Harvard publishes extensively on disinformation. First Draft News (firstdraftnews.org) and the Berkman Klein Center have useful resources for students.