
Media Literacy

How to find, evaluate, and think critically about the information you encounter — in news, on social media, and in everyday sources. In a world of abundant information and deliberate misinformation, knowing how to tell the difference between reliable and unreliable sources is one of the most urgent skills of our time.

Key Ideas at This Level
1 Not everything we hear or see is true.
2 We can ask questions to find out if something is true.
3 Stories can be told in different ways — and the same thing can look very different.
4 People share information for different reasons — sometimes to help, sometimes to confuse or mislead.
5 It is good to check before we believe or repeat something.
Teacher Background

Media literacy at Early Years level is about building the foundational habit of questioning information before accepting it — recognising that not everything seen or heard is necessarily true, and that asking questions is a sign of intelligence, not doubt or distrust. Young children are learning to distinguish fantasy from reality, to understand that adults sometimes disagree, and to notice that the same event can be described very differently by different people. These are the building blocks of media literacy.

In low-connectivity contexts, the most relevant information environments for young children may not be social media but community talk, radio, local media, and the stories told by adults around them. Media literacy at this level should engage with these genuinely local information environments rather than assuming digital media is the only relevant context.

The most important message at this level: it is good and brave to ask 'is that true?' It is wise to check before you believe, and wise to check before you repeat. Children who develop this habit early are significantly more resistant to misinformation throughout their lives. No materials are needed for any activity below.

Skill-Building Activities
Activity 1 — Is it true? The checking habit
Purpose: Children build the foundational habit of questioning information before accepting it — and discover that checking is both possible and important.
How to run it: Tell children three short statements — one clearly true, one clearly false, one uncertain. Example: the sun rises in the east (true). There is a dragon living near our school (false). There will be no rain this week (uncertain). Ask: how do you know which is true? What would you do to find out? Introduce the idea: some things we know because we have seen them ourselves. Some things we have been told by people we trust. Some things we are not sure about. And some things people tell us that are not true. Now play a game: the teacher says something and children decide true, false, or not sure. Discuss how they decided. Introduce the key question: how could you check? Explore checking methods appropriate to their context: asking someone who knows, looking at the thing itself, asking two different people and seeing if they agree, looking at what actually happened. Now introduce the idea of why people sometimes say things that are not true: sometimes by accident (they did not know it was wrong), sometimes because they believed something that turned out to be false, sometimes to make themselves seem important, sometimes to trick or frighten someone. Ask: is it different if someone tells you something false by accident versus on purpose?
💡 Low-resource tip: No materials needed. Use statements from genuinely local contexts — local facts, local rumours, local uncertainties. The checking methods discussed should be ones that children can actually use in their context, not ones that require technology they do not have access to.
Activity 2 — The same story, told differently
Purpose: Children discover that the same event can be described in very different ways — and that how a story is told shapes how we understand it.
How to run it: Tell the same simple event in two different ways and ask children what they notice. Event: two children were arguing over a toy and one child pushed the other. Version A: a child attacked another child without any reason, pushing them to the ground. Version B: two children had a disagreement and one of them pushed the other while they were both upset. Ask: are these the same story? Which version makes the pushing child seem worse? Which makes the situation seem more understandable? What is different about the words used? Now try a third version: a child was being bullied and finally stood up for themselves. Ask: which version is true? Could more than one be true? How would you find out what really happened? Introduce the idea: when we hear a story about something that happened, the person telling it always makes choices — about what to include, what to leave out, and which words to use. Those choices change how we feel about what happened. This does not mean all versions are equally true — it means we should think about who is telling the story and what they might be leaving out.
💡 Low-resource tip: No materials needed. Use a genuine recent incident from the class or community (handled sensitively, without identifying individuals) or an invented scenario. The three-version structure — two extreme versions and one more balanced — makes the lesson very clear.
Activity 3 — Why do people share information? Purpose and motive
Purpose: Children begin to think about the reasons people share information — understanding that purpose and motive affect the reliability and usefulness of what is shared.
How to run it: Ask children: when you tell someone something, why do you do it? Collect answers: to be helpful, because it is interesting, because you are worried, because you want to warn someone, because you want to impress them, because it is funny, because you were asked. Now discuss: do you share information differently depending on why you are sharing it? If you want to warn someone, do you make things sound more dangerous than they are? If you want to impress someone, do you make things sound more exciting? Introduce the idea: the reason someone is sharing information can affect how they tell it. This does not mean they are lying — but it means it is useful to think about why someone is sharing something. Now give three examples of information shared for different reasons and ask children to think about how the reason might affect the content. A friend tells you that a certain path is dangerous at night — why might they be telling you this? A trader at the market tells you that their product is the best and cheapest — why are they telling you this? A community elder tells you about what happened during a difficult time in the past — why are they telling you this? Does the reason change whether you believe them?
💡 Low-resource tip: No materials needed. Use genuinely local examples of information sharing — community announcements, market conversations, stories shared between neighbours. The more familiar the context, the more the lesson connects to children's real information environment.
Reflection Questions
  • Q1: Has someone ever told you something that turned out not to be true? How did you find out? How did it feel?
  • Q2: When an adult tells you something, do you always believe it? Should you? How do you decide?
  • Q3: Have you ever repeated something you heard and then found out it was wrong? What happened?
  • Q4: Are some people more trustworthy sources of information than others? Who do you trust most and why?
  • Q5: Why do you think stories about scary or surprising things spread so quickly?
Practice Tasks
Drawing task
Draw two people telling different versions of the same story. Write or say: they are telling the story of __________, and one person says __________ while the other person says __________.
Skills: Building awareness that the same event can be described differently — the foundation of critical media awareness
Model Answer

Two figures, each with a speech bubble containing a different version of the same event. The completion shows a genuine difference between the two versions — not just different words but a genuinely different interpretation or emphasis. Celebrate drawings that show the tellers in different positions relative to the event (one was there, one was not; one is older, one is younger).

Marking Notes

Ask: which version do you believe? How would you find out what really happened? The checking question is as important as the recognition of difference.

Reflection task
Write or say: before I believe something I hear, I will ask __________. And before I repeat something, I will check __________.
Skills: Building the checking habit as a personal commitment — connecting media literacy to everyday information behaviour
Model Answer

Before I believe something I hear, I will ask: did the person telling me see it themselves, or did they hear it from someone else? And before I repeat something, I will check: is this true and am I sure, or am I only passing on something I heard without knowing if it is right?

Marking Notes

The specificity of the checking question is the most important thing. Vague commitments to check ('I will think about it') are less valuable than specific questions ('Did the person who told me see it themselves? Can I find out from another source?').

Common Mistakes
Common misconception

If many people believe something, it must be true.

What to teach instead

Many people can believe the same false thing — especially when that thing spreads quickly through a community or when it confirms what people already want to believe. The number of people who believe something does not determine whether it is true. Throughout history, large numbers of people have believed things that turned out to be false. The question to ask is not how many people believe it but what evidence supports it.

Common misconception

Adults always tell the truth.

What to teach instead

Adults are generally more reliable sources of information than rumour or guesswork — they often have more experience and knowledge. But adults also make mistakes, believe things that are not true, have reasons to tell stories in particular ways, and sometimes deliberately mislead. Respecting adults does not mean accepting everything they say without thought — it means giving their information appropriate weight while still thinking carefully about it. Questioning a claim is not the same as disrespecting the person who made it.

Common misconception

Checking whether something is true means you do not trust the person who told you.

What to teach instead

Checking whether something is true is not an act of distrust towards any person — it is a good habit that good information-users practise regardless of who the source is. Even people we trust completely can be wrong — they may have received false information themselves, they may be misremembering, or they may not have had access to the full picture. Checking is a sign of intellectual care, not personal suspicion.

Key Ideas at This Level
1 What media is and how information reaches us
2 The difference between news, opinion, and advertising
3 Evaluating sources — who is telling this and why?
4 Misinformation and disinformation — why false information spreads
5 Lateral reading — how fact-checkers actually check facts
6 Our own biases — how what we already believe affects what we believe
Teacher Background

Media literacy at primary level introduces students to a structured framework for evaluating information — distinguishing between news, opinion, and advertising; asking critical questions about sources and motives; understanding how misinformation spreads and why it is so effective; and developing practical verification habits.

News, opinion, and advertising

Students often struggle to distinguish between these three categories of content. News aims to report verifiable facts about events. Opinion aims to express and argue for a point of view. Advertising aims to persuade people to buy or do something. These three categories often appear together in the same newspaper, website, or broadcast — and the boundaries between them are often deliberately blurred. Students who can identify which category a piece of content belongs to are significantly better equipped to evaluate it appropriately.

Source evaluation

The most important question about any piece of information is: who is telling me this, and why? Primary sources (people who witnessed or participated in an event) are generally more reliable than secondary sources (people reporting what someone else said happened). Expert sources (people with relevant knowledge and track records) are generally more reliable than non-expert sources on specialist topics. Sources with transparent funding and editorial standards are generally more reliable than anonymous or opaque sources. But all of these are heuristics, not rules — there are excellent anonymous sources and terrible expert sources.

Lateral reading

Research by Sam Wineburg and the Stanford History Education Group shows that the most effective fact-checking technique is lateral reading — opening multiple browser tabs about an unfamiliar source and reading what others say about it, rather than reading the source itself deeply. Expert fact-checkers spend very little time reading the content of a suspicious source and much more time checking the source's reputation, funding, and track record. This is counterintuitive but highly effective.

Misinformation and disinformation

Misinformation is false information spread without deliberate intent to deceive; disinformation is false information spread with deliberate intent to deceive. The distinction matters for moral assessment but both produce the same harm. False information spreads faster and further than true information, particularly on social media, because it tends to be more emotionally arousing — more surprising, more frightening, more outrageous. Understanding this helps students resist the pull of emotionally compelling false information.

Key Vocabulary
Media
The various ways that information is communicated to large audiences — including newspapers, radio, television, social media, websites, and community communication channels.
Misinformation
False or inaccurate information spread without deliberate intent to deceive — the sharer believes it is true. Misinformation causes harm even without malicious intent.
Disinformation
False information spread deliberately with the intent to deceive or manipulate — the sharer knows it is false. Disinformation is a deliberate act of deception.
Source
The origin of a piece of information — the person, organisation, or document from which information comes. Evaluating the source is the most important step in evaluating any piece of information.
Primary source
A source with direct, firsthand knowledge of an event — a witness, a participant, an original document. Primary sources are generally more reliable than secondary sources for factual claims about events.
Confirmation bias
The tendency to believe information that confirms what we already think and to dismiss information that challenges it. Confirmation bias makes us all more vulnerable to misinformation that matches our existing beliefs.
Clickbait
Content designed to attract attention and provoke clicks through misleading, sensational, or emotionally provocative headlines — often without delivering the substance implied by the headline.
Lateral reading
A fact-checking technique in which you open multiple sources to investigate who is behind a piece of content or a claim — reading across sources about the source, rather than reading the source deeply. The most effective technique used by professional fact-checkers.
Skill-Building Activities
Activity 1 — Sorting the content: news, opinion, or advertising?
Purpose: Students learn to distinguish between the three most important categories of media content — news, opinion, and advertising — and understand how the boundaries are blurred.
How to run it: Collect or describe six to eight pieces of content from available local sources — newspaper clippings, radio broadcast transcripts, notices, community announcements, or invented examples. For each one, ask students to classify it as news (reporting verifiable facts), opinion (arguing for a point of view), or advertising (persuading you to do or buy something). Discuss each classification: what are the clues? Words that reveal opinion: should, must, clearly, obviously. Features of advertising: focus on benefits, calls to action, little acknowledgement of drawbacks. Features of news: multiple sources, quotes, dates, attribution. Now introduce the blurred cases: an opinion column that uses factual evidence. An advertisement that contains true information. A news story that uses language that implies editorial judgment. A community announcement from an organisation with a stake in the outcome. Ask: does it matter which category something belongs to? How does knowing the category change how you read it? Connect to their actual information environment: which category is most common in the information they encounter each day?
💡 Low-resource tip: Works entirely with locally available examples — newspaper pages, radio transcripts, market notices, community posters. No technology needed. If no printed materials are available, the teacher can read examples aloud and students classify verbally.
Activity 2 — Why does false information spread? The viral story experiment
Purpose: Students understand the psychological mechanisms that make false information spread faster than true information — building resistance to the emotional pull of misinformation.
How to run it: Present four pieces of information and ask students to rate how likely they are to share each one (one to five). Story 1: a true but unremarkable local news story — a road has been repaired in a nearby area. Story 2: an emotionally arousing but false story — a dangerous animal has been spotted near the school. Story 3: a true story that confirms widely held beliefs — local market prices have risen again. Story 4: a false story that challenges authority — local officials have been caught behaving badly. After rating, discuss: which stories were most likely to be shared? Why? Introduce the research finding: false information consistently spreads faster and further than true information — because it tends to be more novel, more emotionally arousing (particularly fear and anger), and more surprising. Our brains are wired to pay special attention to threatening or surprising information — it was useful for survival. Ask: knowing this, how should you respond when you encounter information that makes you feel very angry, very frightened, or very surprised? Should you share it immediately or pause? Introduce the 'pause before sharing' habit: when information triggers a strong emotional response, that is a signal to check, not to share.
💡 Low-resource tip: Works entirely through discussion. The four examples can be adapted to local context and spoken rather than written. The key insight — that emotional arousal is a signal to check, not to share — is the most important practical takeaway and requires no technology.
Activity 3 — Evaluating a source: who is telling this and why?
Purpose: Students practise the most fundamental media literacy skill — systematically evaluating the source of information before deciding how much to trust the content.
How to run it: Introduce four questions to ask about any source of information. Who is behind this? Is the author, publisher, or organisation clearly identified? What do I know about them — are they experts, do they have a track record, are they transparent about who they are and what they stand for? What is their purpose? Are they trying to inform, persuade, sell, or entertain? What is their incentive regarding this specific piece of information — do they benefit from you believing it? What is their evidence? Do they cite their sources? Are the sources accessible? Are the people quoted real and do their quotes check out? What do others say about this source? If I search for this organisation or author, what do I find? Do reliable sources trust them? Now apply these four questions to three contrasting examples adapted to local context. Example 1: information from a national public health authority. Example 2: information from an anonymous social media account. Example 3: information from a local community leader or elder. The analysis of example 3 is particularly valuable — trusted community sources deserve respect and still deserve evaluation. Ask: is there anyone whose information you would never question? Is that a good policy?
💡 Low-resource tip: The four questions can be written on the board and applied to spoken examples if no printed materials are available. The evaluation of genuinely local sources — community announcements, radio programmes, trusted individuals — is more valuable than applying the framework to remote examples students cannot check independently.
Reflection Questions
  • Q1: Think of a piece of false information that spread in your community. How did it start? Why did people believe it? What did it take for people to stop believing it?
  • Q2: Is there information you have shared and later discovered was false or misleading? How did you feel? What did you do?
  • Q3: Why do you think people sometimes prefer information that confirms what they already believe over information that challenges it?
  • Q4: Who are the most trustworthy sources of information in your community? What makes them trustworthy? Could they ever be wrong?
  • Q5: If a piece of information makes you feel very angry or very frightened, should that affect whether you share it? How?
  • Q6: Can information that is technically true still be misleading? Can true information be used to deceive?
Practice Tasks
Task 1 — Fact-check a local claim
Choose a claim or story that has circulated in your community or that you have heard recently. Apply the source evaluation framework: (a) who is the source? (b) what is their purpose? (c) what evidence do they offer? (d) what do other sources say? (e) your conclusion: reliable, unreliable, or uncertain — and why. Write 4 to 6 sentences.
Skills: Applying the source evaluation framework to a real local claim — practising the practical media literacy skill that is most directly useful
Model Answer

The claim I heard was that a new medicine was being tested in our area that could cause serious side effects, and that families should refuse to allow their children to participate. The source was a message shared between parents on a community phone network — the original sender was not identified. The purpose appeared to be to warn the community, but the message created significant fear without offering specific information about what the medicine was or where the information came from. No evidence was offered — no named organisation, no document, no named person with medical knowledge. When I asked a health worker at the local clinic, she said she had no information about any such trial and that the message appeared to be false. My conclusion: this claim is unreliable. The anonymity of the source, the absence of specific verifiable details, and the strong emotional effect (fear) are all warning signs. The pause-and-check habit would have prevented unnecessary distress if applied before sharing.

Marking Notes

Award marks for: a genuine and locally relevant claim; honest application of all four evaluation criteria; a conclusion that is consistent with the evidence gathered; and an explanation of the reasoning rather than just a verdict. Strong answers will identify the specific features that make the source more or less reliable rather than making a general judgment.

Task 2 — Analyse a misinformation event
Describe a real example of misinformation that spread in your community, country, or the world. Write: (a) what the false claim was; (b) why it spread — using what you have learned about how misinformation works; (c) what harm it caused; (d) what eventually corrected it — or why it was not corrected. Write 4 to 6 sentences.
Skills: Applying media literacy concepts to a real misinformation event — understanding misinformation as a systemic problem rather than only an individual error
Common Mistakes
Common misconception

You can tell if something is false by whether it looks professional or well-written.

What to teach instead

Some of the most effective disinformation is highly professional, well-written, and visually sophisticated — because producers of disinformation know that these qualities increase credibility. Conversely, reliable information can be informal, imperfectly presented, and locally produced. The quality of the presentation tells you nothing reliable about the accuracy of the content. The relevant questions are about source, evidence, and corroboration — not about how polished the presentation is.

Common misconception

Misinformation is mainly a problem created by unintelligent or uneducated people.

What to teach instead

Research consistently shows that intelligence and education provide limited protection against misinformation — and in some cases, educated people are more susceptible to sophisticated misinformation because they are more confident in their ability to evaluate it. Misinformation exploits universal features of human cognition — confirmation bias, emotional responsiveness, social trust — that affect everyone. The people who are best at resisting misinformation are not necessarily the most intelligent but those who have developed specific habits of verification and those who are most aware of their own vulnerability.

Common misconception

Once you know that a piece of information is false, you stop believing it.

What to teach instead

Research on the continued influence effect shows that corrections to false beliefs are often less effective than hoped — people continue to be influenced by information they know to be false, particularly when the false information was vivid, emotionally engaging, or consistent with pre-existing beliefs. Effective correction requires not only stating that the information is wrong but explaining why it was wrong, providing an accurate alternative narrative, and repeating the correction multiple times. A single correction is rarely sufficient.

Common misconception

Being media literate means being sceptical about everything.

What to teach instead

Universal scepticism — doubting everything — is as problematic as naive credulity. Excessive scepticism prevents people from accepting reliable information, creates paralysis, and can itself be exploited by those who want to prevent belief in true and important facts (about climate change, public health, justice). The goal of media literacy is calibrated scepticism — applying scrutiny proportional to the importance of the claim, the reliability of the source, and the potential cost of being wrong. Strong evidence from reliable sources should produce genuine belief. Weak evidence from unreliable sources should produce scepticism.

Key Ideas at This Level
1 The information ecosystem — how information flows, who controls it, and who is excluded
2 The psychology of belief — why we believe what we believe
3 Disinformation as a political tool — how false information is used to manipulate
4 Algorithmic curation — how social media shapes what we see and believe
5 Journalism and its norms — how news is produced and what standards it should meet
6 Media ownership and power — who controls the information environment
Teacher Background

Secondary media literacy engages students with the systemic dimensions of information — how information ecosystems are structured, who has power over them, how they can be manipulated, and what democratic information environments require.

The information ecosystem

Information does not flow neutrally through society. It is produced, selected, amplified, and suppressed by institutions and individuals with specific interests, values, and resources. Understanding the information ecosystem means understanding who produces information, who distributes it, who benefits from its spread, who is excluded from the conversation, and what structural conditions shape what information is available.

Algorithmic curation

Social media platforms use algorithmic systems to determine which content each user sees — prioritising content that is most likely to keep the user on the platform (which tends to be emotionally arousing, identity-affirming, and novel). This produces filter bubbles (seeing only information consistent with existing beliefs) and echo chambers (interacting primarily with people who share your views). Research on filter bubbles is more mixed than the initial claims suggested — people are exposed to more cross-cutting information than the filter bubble hypothesis implies — but algorithmic amplification of emotionally arousing content (particularly outrage) is well-documented.
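For teachers who want a concrete picture of this mechanism, the amplification dynamic can be illustrated with a deliberately simplified ranking sketch. Nothing here reflects any real platform's code: the scoring weights, the example posts, and the `predicted_engagement` function are all invented for illustration. The structural point is that when a ranking objective rewards arousal and novelty and gives no weight to accuracy, false but arousing content rises to the top of the feed.

```python
# Toy model of engagement-based ranking (illustrative only, not any
# platform's real algorithm). Each post gets a predicted-engagement
# score; the feed is simply the posts sorted by that score.

def predicted_engagement(post):
    # Hypothetical scoring: emotional arousal and novelty drive the
    # score, while accuracy earns no weight at all.
    return 3.0 * post["arousal"] + 2.0 * post["novelty"]

posts = [
    {"title": "Road repaired on schedule",      "arousal": 0.1, "novelty": 0.2, "accurate": True},
    {"title": "Outrageous rumour about rival",  "arousal": 0.9, "novelty": 0.8, "accurate": False},
    {"title": "Calm explainer on local budget", "arousal": 0.2, "novelty": 0.3, "accurate": True},
    {"title": "Shocking (false) health scare",  "arousal": 1.0, "novelty": 0.9, "accurate": False},
]

# Rank the feed by predicted engagement, highest first.
feed = sorted(posts, key=predicted_engagement, reverse=True)
for post in feed:
    print(f'{predicted_engagement(post):4.1f}  {post["title"]}')
```

Running the sketch puts the two false, high-arousal items at the top of the feed and the accurate, low-arousal items at the bottom, even though accuracy was known to the system and simply not valued. This is the classroom point in miniature: the ordering is a consequence of the objective, not of any judgment about truth.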

Disinformation as a political tool

Deliberate disinformation campaigns — by governments, political actors, and commercially motivated agents — are well-documented features of the current information environment. The most effective disinformation campaigns exploit existing social divisions, amplify extreme voices, and create confusion rather than specific false beliefs. Documented examples include Russian information operations during the 2016 US election, disinformation about COVID-19 vaccines, and systematic false information about electoral integrity.

Media ownership

In most countries, the media is owned by a small number of individuals and corporations whose interests and values inevitably shape what is produced, emphasised, and ignored. The concentration of media ownership is associated with reduced diversity of perspective, less investigative journalism, and greater alignment of editorial content with owner interests. Public service media (funded by government but editorially independent) is one structural response to this problem.

Key Vocabulary
Information ecosystem
The full system through which information is produced, distributed, consumed, and acted upon in a society — including media organisations, social media platforms, community networks, educational institutions, and informal communication.
Filter bubble
A state in which algorithmic personalisation limits a user's exposure to information that challenges their existing views — creating a bubble of self-confirming content. The filter bubble concept has been critiqued as overstated but algorithmic amplification of identity-affirming content is documented.
Echo chamber
A social environment in which a person encounters only opinions and information that reflect and reinforce their own — usually through deliberate or habitual selection of like-minded sources and communities.
Algorithmic curation
The automated selection and ranking of content for individual users by social media platforms and search engines — using machine learning to predict and amplify content that maximises engagement.
Information operation
A coordinated campaign to influence public opinion through deliberate manipulation of the information environment — using false information, amplification of real but extreme content, or impersonation of authentic voices.
Prebunking
A counter-disinformation technique that inoculates people against specific false information before they encounter it — by explaining the misleading technique being used, rather than correcting specific false claims after the fact.
Media ownership
Who owns and controls media organisations — determining editorial priorities, coverage decisions, and the values reflected in media output. Concentration of media ownership reduces diversity of perspective and increases the power of a small number of individuals to shape public information.
Public service media
Media organisations funded by the public (through licence fees, taxes, or other means) and editorially independent of both government and commercial pressure — such as the BBC, ABC, and equivalents in many countries. Public service media is one structural response to the problems of commercial media.
Epistemic autonomy
The ability to form your own beliefs through your own reasoning — free from manipulation, excessive influence, or information environments designed to produce specific beliefs. A healthy democracy requires citizens with epistemic autonomy.
Continued influence effect
The research finding that false information continues to influence beliefs and judgments even after it has been corrected — because the false information was processed and stored, and corrections are less vivid and less repeated than the original false claim.
Skill-Building Activities
Activity 1 — How algorithms shape what you believe: understanding the curation machine
Purpose: Students understand how social media algorithms determine what information they see — and what the consequences are for their beliefs, their social connections, and democratic life.
How to run it: Begin with the question: who decides what information you see on social media? Most students will say they choose — they follow accounts they like, they search for content they want. Introduce the reality: algorithmic curation systems make the vast majority of decisions about what appears in any individual's feed, based on a prediction of what will maximise engagement (time on platform, clicks, shares, comments). These systems have no interest in the truth, accuracy, or social value of the content — only in its ability to hold attention. Present three documented consequences.
  • Consequence 1 — Emotional amplification: content that provokes strong emotions — particularly outrage and fear — generates more engagement than calm, nuanced content. Algorithms therefore systematically amplify the most emotionally extreme content.
  • Consequence 2 — Identity affirmation: content that confirms existing identity and beliefs generates more engagement than content that challenges them. Algorithms therefore tend to reduce exposure to challenging perspectives.
  • Consequence 3 — Novelty bias: new and surprising information generates more engagement than familiar true information. Algorithms therefore amplify novel false information at the expense of accurate but familiar information.
Ask: knowing this, how should you engage with social media as an information source? What practices would help you resist algorithmic manipulation of your beliefs? Now ask the harder question: is this a problem individuals can solve through better personal media habits — or does it require structural solutions (regulation, platform redesign)? What would structural solutions look like?
💡 Low-resource tip: Works entirely through discussion. Students in low-connectivity settings may have limited direct experience of algorithmic social media — adapt to the information environment they actually use, including radio, television, community information networks, and messaging applications.
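For classrooms with any programming exposure, the engagement-maximisation logic described in Activity 1 can be shown concretely. The sketch below is purely illustrative — the post titles and engagement scores are invented, and no real platform's algorithm is this simple — but it demonstrates the core point: a ranker that sorts only by predicted engagement, with no notion of truth, will push emotionally extreme and novel content to the top of the feed.

```python
# Illustrative sketch only: invented posts and scores, not any
# platform's real algorithm. The point is the ranking logic itself.

posts = [
    {"title": "Calm, nuanced report", "predicted_engagement": 0.02},
    {"title": "Outrage-provoking rumour", "predicted_engagement": 0.11},
    {"title": "Familiar accurate story", "predicted_engagement": 0.03},
    {"title": "Novel false claim", "predicted_engagement": 0.09},
]

# The ranker knows nothing about accuracy or social value: it sorts
# purely by predicted engagement, so the most emotionally extreme and
# novel items rise to the top of the feed.
feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

for position, post in enumerate(feed, start=1):
    print(position, post["title"])
```

Students can experiment by changing the scores: as long as the outrage-provoking and novel items carry the highest predicted engagement, they will always outrank the calm and accurate ones, regardless of truth.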
Activity 2 — Disinformation as a political tool: case study analysis
Purpose: Students analyse a documented disinformation campaign — understanding the techniques used, the targets, and the real-world consequences — developing the specific pattern recognition needed to resist disinformation.
How to run it: Present a documented disinformation campaign adapted to the local or regional context. If local examples are unavailable, use one of the following well-documented cases: the spread of disinformation about COVID-19 vaccines in multiple countries; the documented interference in the 2016 US election through social media manipulation; the use of false information to inflame religious or ethnic tensions in various countries in the 2010s. For each case, analyse five questions.
  • What was the goal? (To produce specific beliefs? To create confusion? To suppress turnout? To inflame division?)
  • Who was behind it? (State actors, commercial actors, political campaigns, or a combination?)
  • What techniques were used? (False stories, amplification of real extreme content, impersonation, fabricated images or video?)
  • Who was most successfully targeted and why? (What made some communities more vulnerable than others?)
  • What were the real-world consequences? (Violence, changed votes, vaccine hesitancy, erosion of trust?)
Now ask: what would make any community more or less resistant to disinformation campaigns? What conditions — social, political, educational, technical — reduce vulnerability? What conditions increase it?
💡 Low-resource tip: Works entirely through discussion and analysis. Use genuinely local or regional examples where available — the disinformation environment is different in different parts of the world, and the most relevant examples are those closest to students' actual experience.
Activity 3 — Who controls the information? Media ownership and power
Purpose: Students examine the structural dimension of media literacy — who owns and controls information environments, and what this means for which information is available, emphasised, or suppressed.
How to run it: Begin with the question: who owns the main media in your country? (This may be government, private corporations, individuals, religious organisations, or a mixture.) Investigate together if information is available — or estimate based on what students know. Now ask four questions about ownership and its implications.
  • Question 1: do the owners of media have interests that might shape what is covered and how? (A media outlet owned by a mining company might cover environmental news differently from one without that conflict of interest.)
  • Question 2: is there any media in your country that is genuinely editorially independent — not owned by government or by commercial interests that conflict with honest reporting?
  • Question 3: who is excluded from the media — whose stories are not told, whose perspectives are not represented, whose communities are invisible in mainstream coverage?
  • Question 4: what is the difference between government-controlled media and public service media that is government-funded but editorially independent? How would you tell which kind you have?
Now connect to the democracy question: a democracy requires citizens who can access accurate, diverse information about their society and their government. What conditions in your country support or undermine this? What would a healthier information ecosystem look like?
💡 Low-resource tip: Works entirely through discussion. The analysis of local media ownership will vary enormously by country — some students will have direct knowledge of who controls media in their country; others will not. The framework of questions is more important than specific answers.
Reflection Questions
  • Q1: Research shows that false information spreads faster than true information on social media. Is this a problem that individuals can solve through better media habits — or does it require structural solutions? What would those solutions look like?
  • Q2: Filter bubbles and echo chambers are said to be reducing exposure to diverse perspectives and increasing political polarisation. How much evidence is there for this in your own information environment?
  • Q3: Prebunking — warning people about disinformation techniques before they encounter them — is more effective than debunking after the fact. What does this tell us about how to design media literacy education?
  • Q4: Fact-checking organisations have proliferated in response to the misinformation problem. How effective are they? Who uses them and who does not — and why?
  • Q5: Some governments argue that controlling disinformation requires platform regulation or content moderation — which others argue is censorship. Where is the line between protecting an information environment and suppressing legitimate speech?
  • Q6: Is your country's media environment healthy? What evidence do you use to judge? What would need to change for it to be healthier?
Practice Tasks
Task 1 — Evaluate your information environment
Write an honest analysis of your own information environment. Include: (a) your main sources of news and information; (b) the ownership and editorial values of each; (c) what perspectives are well-represented and what is missing or underrepresented; (d) how much your information environment is algorithmically curated; (e) one change you could make to improve the diversity and reliability of your information intake. Write 300 to 400 words.
Skills: Applying media literacy concepts to personal information practices — developing honest self-awareness about information consumption
Task 2 — Essay: information and democracy
Choose ONE of the following questions and write a 400 to 600 word essay. (a) A healthy democracy requires a healthy information environment. Is the current information environment — in your country or globally — compatible with healthy democracy? (b) Social media platforms should be regulated to reduce the spread of disinformation, even if this requires content moderation that some argue is censorship. Do you agree? (c) Individual media literacy education is necessary but insufficient to address the misinformation crisis — structural changes to the information ecosystem are required. Do you agree?
Skills: Constructing a reasoned argument about the relationship between information, power, and democratic life
Common Mistakes
Common misconception

Fact-checking organisations are the solution to the misinformation problem.

What to teach instead

Fact-checking is valuable but has significant limitations as a response to large-scale misinformation. Fact-checks reach a much smaller audience than the original false claim — most people who see misinformation never see the fact-check. The continued influence effect means that even people who see a fact-check often remain influenced by the original false information. Fact-checks can sometimes amplify false claims by repeating them. And the volume of false information now far exceeds the capacity of any fact-checking infrastructure to address it. Fact-checking is one useful tool among many, not a comprehensive solution.

Common misconception

Mainstream media is always more reliable than independent or social media.

What to teach instead

Mainstream media varies enormously in reliability, editorial standards, and independence from commercial and political pressure. Some mainstream media is excellent; some is unreliable, partisan, or captured by powerful interests. Some independent media has very high editorial standards. The relevant questions are about editorial independence, transparency, track record for accuracy, and correction practices — not about whether a source is classified as mainstream. In many countries, the most reliable journalism is done by organisations that are not part of the mainstream media establishment.

Common misconception

The solution to the misinformation problem is to teach everyone to be more sceptical.

What to teach instead

Increasing general scepticism is as likely to be exploited by disinformation as to resist it. Disinformation campaigns often deliberately target trust in reliable institutions — encouraging scepticism of established science, legitimate media, and electoral systems — so that false claims seem as plausible as true ones. The goal of media literacy is not maximum scepticism but calibrated scepticism — trusting reliable sources appropriately and being sceptical of unreliable ones. This requires the ability to distinguish between sources, which is a specific skill, not a general attitude of distrust.

Common misconception

The misinformation problem is mainly caused by social media and would be solved by better platform regulation.

What to teach instead

Misinformation long predates social media: it spread through print, radio, and television for decades. Social media has significantly accelerated the speed and scale of false information spread — but the underlying cognitive, social, and political conditions that make misinformation effective are not created by social media. Platform regulation can help reduce algorithmic amplification of false content, but it cannot address confirmation bias, social identity motivations for believing false information, or deliberate state-sponsored disinformation. A complete response to misinformation requires action at multiple levels — individual, platform, regulatory, and social.

Further Practice & Resources

Key texts and resources:
  • Sam Wineburg's Why Learn History (When It's Already on Your Phone) (2018, University of Chicago Press) is the most rigorous account of how professional fact-checkers actually evaluate information, with specific practical implications for education. His Stanford History Education Group's Civic Online Reasoning curriculum (sheg.stanford.edu) provides free, peer-reviewed media literacy lessons.
  • Kate Starbird's research on crisis misinformation is available through her University of Washington lab and in many accessible articles.
  • Whitney Phillips's This Is Why We Can't Have Nice Things (2015, MIT Press) examines the origins of trolling and online harassment as disinformation precursors.
  • For the algorithmic dimension: Eli Pariser's The Filter Bubble (2011, Penguin) introduced the concept; Zeynep Tufekci's Twitter and Tear Gas (2017, Yale) provides the most nuanced analysis of social media and political mobilisation.
  • For media ownership: Robert McChesney's The Political Economy of Media (2008, Monthly Review Press) provides the most comprehensive structural analysis.
  • For prebunking: John Cook and Stephan Lewandowsky's research on inoculation theory is available through their respective university websites and in many freely available articles.
  • For journalism standards: the Reuters Institute Digital News Report (annual, freely available at digitalnewsreport.org) provides the most comprehensive global data on news consumption, trust, and misinformation.
  • For local African context: the Africa Check website (africacheck.org) is one of the most reliable fact-checking organisations on the continent and publishes free educational resources.
  • For practical classroom tools: the SIFT method (Stop, Investigate the source, Find better coverage, Trace claims) developed by Mike Caulfield is freely available and widely used.