
Critical Thinking

How to question, analyse, and evaluate ideas and information carefully rather than accepting them at face value — a practical skill for every classroom and every subject.

Key Ideas at This Level
1. It is good to ask 'why?' and 'how do you know?'
2. We should look carefully before we decide
3. People can be wrong — even people we trust
4. We should hear different ideas before we choose
5. It is okay to change your mind when you learn something new
Teacher Background

Critical thinking at Early Years level is about building the habit of pausing and questioning before accepting something as true. Children are naturally curious — the goal is to keep that curiosity alive and connect it to a few simple habits: look carefully, ask why, think about whether it makes sense, and hear other ideas before deciding. At this stage, critical thinking is taught through concrete, playful activities rather than abstract principles. Children do not need the phrase 'critical thinking' — they need to practise it.

The most powerful teaching move is to model the habit yourself: say 'that is interesting — how do we know?' and 'let us look carefully before we decide.' In low-resource classrooms, all these activities work through talk and observation alone — no materials are needed. Avoid teaching children to be cynical or to distrust everything. The goal is thoughtful, curious engagement — not scepticism for its own sake.

Skill-Building Activities
Activity 1 — Look carefully: the observation game
Purpose: Children practise careful observation before drawing conclusions — the foundation of all critical thinking.
How to run it: Place a simple object where all children can see it — or describe one verbally. Ask children to tell you everything they can observe about it. Then ask: What do you think this object is used for? Accept several answers. Then ask: How do you know? What made you think that? Introduce a class rule: 'We do not guess — we look carefully first.' Ask children to look again. What did they miss the first time? Explain: looking carefully before deciding is one of the most important thinking skills. We often see what we expect to see rather than what is actually there.
💡 Low-resource tip: Use any available object — a stone, a leaf, a piece of cloth. If nothing is available, describe an object verbally and ask children to picture it carefully before answering.
Activity 2 — But how do you know? (questioning claims)
Purpose: Children learn to ask for reasons rather than accepting claims at face value.
How to run it: Make a series of simple claims and ask children to respond with: 'But how do you know?' Use claims such as: 'Dogs are the cleverest animals.' 'It will definitely rain tomorrow.' 'This is the best food in the world.' 'Everyone loves this story.' For each, ask: Is this definitely true? How could we find out? Could someone disagree? Celebrate children who ask the question — explain that this is exactly what good thinkers do. They do not just believe something because someone says it. They ask: what is the reason? This question can become a regular classroom routine.
💡 Low-resource tip: No materials needed. Teacher makes the claims and children respond. Consider making 'But how do you know?' a permanent class habit — display it on the wall if possible.
Activity 3 — Two sides: hearing before deciding
Purpose: Children practise hearing more than one view before forming an opinion.
How to run it: Present a simple question with two reasonable answers: 'Should we play inside or outside today?' or 'Was the wolf in the story bad or just hungry?' Ask one group to give reasons for one side, then ask another group to give reasons for the other. Ask the whole class: Did hearing the other reasons change your mind at all? Is it possible that both sides have something true in them? Explain: good thinkers always try to hear the other side before deciding. Changing your mind when you hear a good reason is a sign of strength, not weakness.
💡 Low-resource tip: Use any simple question relevant to classroom life or a story just read. No materials needed.
Reflection Questions
  • Q1: What is the best question to ask when someone tells you something surprising?
  • Q2: Have you ever changed your mind about something? What made you change it?
  • Q3: Can grown-ups be wrong? Can you think of an example?
  • Q4: What is the difference between guessing and knowing?
  • Q5: Why is it important to listen to someone you disagree with?
Practice Tasks
Drawing task
Draw a thinker asking a question. Write or say: A good thinker always ___________.
Skills: Connecting a self-image as a thinker to a specific thinking habit
Model Answer

Any drawing of a person thinking or questioning, with a completion such as: 'A good thinker always asks how do you know' or 'A good thinker always looks carefully before deciding.' The goal is to help children identify themselves as thinkers and connect that identity to a concrete habit.

Marking Notes

Look for a genuine thinking habit in the completion — not just 'a good thinker always thinks.' Discuss: what does this look like in real life? When did you do this today?

Sentence completion
Before I believe something, I ___________. I changed my mind once when ___________.
Skills: Articulating a personal critical thinking habit and connecting it to lived experience
Model Answer

Before I believe something, I ask how do you know and look for more information. I changed my mind once when my friend showed me that the spider was not dangerous after all.

Marking Notes

Accept any response that describes an active thinking step before believing, and any genuine example of changing one's mind. The second completion is especially valuable — children who can recall changing their mind have demonstrated genuine critical thinking in action.

Common Mistakes
Common misconception

Asking 'how do you know?' is rude or shows you do not trust the person.

What to teach instead

Asking for reasons is a sign of respect for the truth — and for the person you are asking. It means you are taking what they say seriously enough to want to understand it better. Good teachers, scientists, and leaders all ask 'how do you know?' It is one of the most important questions a person can ask.

Common misconception

Changing your mind means you were wrong and should feel bad.

What to teach instead

Changing your mind when you learn something new is a sign of good thinking. Scientists, doctors, and leaders change their minds all the time when new evidence appears. The goal is not to be right first time — it is to end up with the best possible answer. People who never change their minds are not stronger thinkers; they are just more stubborn.

Key Ideas at This Level
1. What critical thinking is — and what it is not
2. Asking the right questions: the skill of enquiry
3. Distinguishing facts, opinions, and contested claims
4. Spotting weak arguments: five common fallacies
5. Evaluating sources with the SIFT method
6. Applying critical thinking to everyday information
Teacher Background

Critical thinking is the disciplined habit of questioning, analysing, and evaluating ideas and information — rather than accepting them automatically. It is not cynicism (believing nothing) or contrarianism (disagreeing with everything) but the careful, open-minded search for good reasons to believe something. Critical thinking involves several component skills that can be practised separately and combined.

Asking productive questions

Not just 'what?' but 'why?', 'how do we know?', 'what is the evidence?', 'who says so?', and 'could this be wrong?'

Distinguishing facts from opinions

A fact is a claim that can in principle be verified by evidence; an opinion is a claim about value, preference, or interpretation. Many claims fall in between — they are contested, meaning people disagree because the evidence is complex or because values differ.

Identifying logical fallacies

Fallacies are common patterns of bad reasoning: attacking the person rather than the argument (ad hominem); believing something because many people do (bandwagon); assuming that because A came before B, A caused B (false cause); misrepresenting someone's argument to make it easier to attack (straw man); and accepting something as true simply because an authority said it (appeal to authority).

Evaluating sources using the SIFT method

Stop before reacting; Investigate the source; Find better coverage; Trace claims to their origin. These skills are relevant in every subject and in everyday life.

Teaching note

Critical thinking is best taught through practice, not through explanation. Use real examples from students' lives wherever possible. Model the habits yourself — think aloud when you evaluate information, and celebrate students who ask good questions.

Key Vocabulary
Claim
A statement that something is true. Claims need to be supported by reasons and evidence before we accept them.
Evidence
Information that supports or challenges a claim — such as data, observations, examples, or expert opinion. Good critical thinkers ask: what is the evidence, and how reliable is it?
Assumption
Something taken for granted in an argument without being stated or proved. Identifying hidden assumptions is a key critical thinking skill.
Opinion
A personal view or interpretation that cannot be proved true or false by evidence alone — though opinions can be better or worse supported by reasons.
Fact
A claim that can in principle be checked against evidence and shown to be true or false. Facts can still be wrong — the key is that they are checkable.
Logical fallacy
A common pattern of bad reasoning — an error in argument that makes a conclusion seem stronger than it really is.
Bias
A tendency to favour one side or conclusion, often without realising it. Bias can affect both the information we look for and how we interpret it.
Counter-argument
A reason or piece of evidence that challenges a claim or argument. Considering counter-arguments is essential to good critical thinking.
Skill-Building Activities
Activity 1 — Fact, opinion, or contested? (classification exercise)
Purpose: Students develop the ability to distinguish between factual claims, opinions, and genuinely contested claims — and understand why the distinction matters.
How to run it: Present ten statements and ask students to classify each as Fact (checkable against evidence), Opinion (personal view or value judgement), or Contested (people disagree because evidence is complex or values differ). Statements to use: (1) 'The Earth is approximately 4.5 billion years old.' (2) 'Football is more exciting than cricket.' (3) 'Climate change is mainly caused by human activity.' (4) 'The government should spend more money on schools.' (5) 'Nelson Mandela was in prison for 27 years.' (6) 'Chocolate is the best flavour of ice cream.' (7) 'Immigration has a positive effect on the economy.' (8) 'The French Revolution began in 1789.' (9) 'Violent video games cause aggressive behaviour.' (10) 'It is wrong to eat meat.' After classifying, discuss: Why does it matter whether something is a fact or an opinion? Can an opinion be well or poorly supported by reasons? What makes statements 3, 7, and 9 genuinely contested rather than simply factual?
💡 Low-resource tip: Teacher reads each statement aloud. Students show their classification with hand signals — thumbs up for fact, thumbs down for opinion, flat hand for contested. No materials needed.
Activity 2 — Spotting weak arguments: five common fallacies
Purpose: Students learn to identify five common logical fallacies and practise recognising them in realistic examples.
How to run it: Teach the five fallacies with memorable names and one-sentence definitions. Then give one example of each and ask students to identify the fallacy being used. (1) Ad hominem — attacking the person rather than their argument. Example: 'We should not listen to her views on the environment — she drives a big car.' (2) Bandwagon — believing something because many people do. Example: 'Everyone is buying this supplement so it must work.' (3) False cause — assuming that because A came before B, A caused B. Example: 'I wore my lucky socks and we won the match, so the socks caused the victory.' (4) Straw man — misrepresenting someone's argument to make it easier to attack. Example: 'She said we should reduce car use in cities. So she wants to ban all cars and leave people stranded.' (5) Appeal to authority — accepting something as true simply because an expert or famous person said it, without checking the evidence. Example: 'A famous actor said this supplement cures cancer, so it must be true.' After identifying each fallacy, ask: How would you respond to this argument? What would a stronger argument look like?
💡 Low-resource tip: Teacher describes each fallacy and example verbally. Students identify the fallacy by name. No materials needed. The five names are memorable enough to use without written reference.
Activity 3 — Evaluating information: the SIFT method
Purpose: Students apply a practical four-step method to evaluate whether information they encounter is reliable.
How to run it: Introduce the SIFT method: Stop — pause before sharing or believing something; resist the emotional impulse to react. Investigate the source — who produced this? What do they want? Are they reliable? Find better coverage — is this story covered by other sources? What do they say? Trace claims to their origin — can you find the original data, study, or event the story is based on? Present a realistic scenario: a social media post claims that a new study has found that drinking coffee every day reduces the risk of cancer by 50%, and it has been shared thousands of times. Walk students through SIFT: Stop — does this seem too good to be true? Investigate — who published this study and in what journal? Is the 50% figure accurate? Find better coverage — have reputable medical organisations reported this? Trace — what does the original study actually say? Discuss: What questions does SIFT train you to ask? Why is the emotional impulse to share something surprising or pleasing dangerous for information quality?
💡 Low-resource tip: Teacher presents the scenario verbally. Students work through the SIFT steps in discussion groups. No technology or printed materials needed.
Reflection Questions
  • Q1: What is the difference between a fact and an opinion? Can you give an example of each from today's lesson?
  • Q2: Have you ever believed something that turned out to be wrong? How did you find out? How did it feel?
  • Q3: Can you think of a time when someone used a weak argument to try to persuade you? What made it weak?
  • Q4: Why is it important to consider who produced a piece of information before deciding whether to believe it?
  • Q5: What is the difference between being a critical thinker and being a cynic who trusts nothing?
  • Q6: How might practising critical thinking make you a better student, friend, or citizen?
Practice Tasks
Task 1 — Apply the SIFT method to a real claim
Find ONE claim from a newspaper, social media, or a conversation this week. Write: (a) the claim, (b) who made it and why they might have made it, (c) what evidence is given, (d) what questions you would ask to evaluate it. Write 4 to 6 sentences.
Skills: Applying the SIFT method and evidence evaluation to a real-world claim
Model Answer

A claim I found was in a social media post: 'Scientists say that teenagers who use social media for more than two hours a day are three times more likely to feel depressed.' The post was shared by a campaign group advocating for restrictions on social media use, which means they have an interest in presenting the strongest possible evidence of harm. The post does not name which scientists or which study the statistic comes from, making it difficult to evaluate. I would ask: What study is this based on and was it peer-reviewed? Does 'more likely' mean social media caused the depression or just that the two are associated? Is the three times figure from the study itself or a journalist's interpretation of it?

Marking Notes

Award marks for: identifying a genuine real-world claim; noting the source and any potential bias; identifying what evidence is given or absent; asking at least two specific evaluative questions. Strong answers will distinguish between correlation and causation, or between an original study and a report about it. Accept any well-chosen real-world claim.

Task 2 — Identify and correct a fallacy
Choose ONE of the five fallacies from the lesson. Write: (a) the name of the fallacy and what it means, (b) a new example of it from real life or the news, (c) why this is a weak argument, (d) what a stronger version of the argument would look like. Write 4 to 6 sentences.
Skills: Recognising and naming a logical fallacy, generating an original example, and constructing a stronger argument
Common Mistakes
Common misconception

Critical thinking means finding fault with everything.

What to teach instead

Critical thinking is about evaluating ideas carefully — which sometimes means accepting them, not rejecting them. A critical thinker who looks at an argument and finds it well-supported by evidence accepts the conclusion. Critical thinking is not contrarianism. Its goal is to end up with the most accurate and best-supported beliefs — which sometimes means changing your mind towards what you initially rejected.

Common misconception

A strong argument is one that uses lots of facts and statistics.

What to teach instead

Facts and statistics can support good arguments but can also be misused — taken out of context, cherry-picked, or misrepresented. A strong argument is one where the evidence genuinely supports the conclusion, where assumptions are stated, where counter-arguments are considered, and where the reasoning is valid. More data does not automatically mean better reasoning.

Common misconception

If something appears on the internet with lots of sources, it must be true.

What to teach instead

Quantity of sources is not the same as quality. Multiple websites can repeat the same false claim, each appearing to confirm the others — a phenomenon called circular reporting. What matters is whether the sources are independent, whether they are reliable, whether they cite original evidence, and whether that evidence actually supports the claim.

Common misconception

Critical thinking is a natural ability — either you have it or you do not.

What to teach instead

Critical thinking is a set of learnable skills. Like any skill, it improves with deliberate practice. Research shows that explicit instruction in critical thinking habits — asking specific questions, identifying fallacies, evaluating sources — produces measurable improvements in reasoning ability. No one is born a critical thinker; everyone can become one.

Key Ideas at This Level
1. The structure of arguments — premises, conclusions, validity, and soundness
2. Deductive and inductive reasoning
3. Evaluating evidence — types, reliability, and strength
4. Cognitive biases — why smart people think badly
5. Epistemic humility — knowing what you do not know
6. Steelmanning — engaging with the strongest version of an opposing argument
7. Motivated reasoning and the limits of rational argument
8. Applying critical thinking to complex real-world questions
Teacher Background

Critical thinking at secondary level requires engagement with the formal structure of arguments and with the psychological barriers to good thinking.

The structure of arguments: an argument has premises (the reasons offered in support) and a conclusion (the claim being argued for). An argument is valid if the conclusion follows logically from the premises, and sound if it is valid AND the premises are true. Many arguments are valid but unsound (the reasoning is correct but a premise is false); others are invalid (the conclusion does not follow even if the premises are true). Teaching students to keep these two questions separate — 'Does the conclusion follow from the premises?' and 'Are the premises actually true?' — is one of the most valuable reasoning skills.

Deductive reasoning moves from general principles to specific conclusions: if all premises are true and the argument is valid, the conclusion must be true. Inductive reasoning moves from specific observations to general conclusions: it can produce very strong support for a conclusion, but never certainty. Most reasoning in everyday life and science is inductive.
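For teachers who want to see what 'the conclusion follows by structure alone' means in fully explicit form, the classic Socrates syllogism can be written as a machine-checked proof. This is an optional aside for teacher background, not lesson material; the Lean 4 sketch below is illustrative, and the names (Human, Mortal, socrates) are placeholders — any predicates and any individual would work, which is exactly the point about validity.

```lean
-- Validity is structural: the proof never uses what Human or Mortal mean.
theorem syllogism {α : Type} (Human Mortal : α → Prop)
    (premise1 : ∀ x, Human x → Mortal x)  -- Premise 1: all humans are mortal
    (socrates : α)
    (premise2 : Human socrates) :          -- Premise 2: Socrates is a human
    Mortal socrates :=                     -- Conclusion: Socrates is mortal
  premise1 socrates premise2               -- apply premise 1 to Socrates
```

Soundness is the further, empirical question of whether the premises are actually true — something no formal system can settle on its own, which is why the two questions must be kept separate.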

Evaluating evidence

Not all evidence is equally strong. A hierarchy of evidence exists in medicine and social science — randomised controlled trials are stronger than case studies; peer-reviewed meta-analyses are stronger than single studies; systematic reviews are stronger than expert opinion.

Key biases

Cognitive biases are systematic errors in human reasoning. Key biases include: confirmation bias (seeking evidence that confirms existing beliefs); availability heuristic (overestimating the likelihood of things that come easily to mind); anchoring (over-relying on the first piece of information encountered); in-group bias (treating evidence more favourably when it supports one's own group); and the Dunning-Kruger effect (people with limited knowledge tend to overestimate their competence). These biases affect experts as well as novices and often cannot be overcome simply by knowing about them.

Steelmanning and epistemic humility

Steelmanning — deliberately constructing the strongest possible version of the argument you disagree with — is one of the most powerful intellectual practices, because it ensures genuine engagement with opposing views. Epistemic humility — knowing the limits of what you know and being willing to say 'I do not know' — is a key intellectual virtue.

Motivated reasoning

Research shows that people often become more entrenched in false beliefs when confronted with contradicting evidence. Understanding when and why this happens is an important part of applied critical thinking.

Key Vocabulary
Valid argument
An argument in which the conclusion follows logically from the premises — if the premises are true, the conclusion must be true. Validity concerns the structure of the argument, not whether the premises are actually true.
Sound argument
An argument that is both valid AND has true premises. A sound argument guarantees a true conclusion.
Inductive reasoning
Reasoning from specific observations to a general conclusion — producing strong support but not certainty. Most scientific reasoning and everyday thinking is inductive.
Deductive reasoning
Reasoning from general principles to specific conclusions — if the premises are true and the argument is valid, the conclusion is certain. Mathematics and formal logic use deductive reasoning.
Confirmation bias
The tendency to seek, notice, and interpret information in ways that confirm what we already believe — one of the most pervasive and powerful cognitive biases.
Steelmanning
Deliberately constructing the strongest possible version of an opposing argument before responding to it — the opposite of straw-manning. A mark of intellectual honesty and rigour.
Epistemic humility
Awareness of the limits of one's own knowledge and the genuine possibility of being wrong — combined with openness to learning and updating one's views.
Motivated reasoning
The tendency to reason towards conclusions that are emotionally or socially convenient rather than conclusions that the evidence actually supports — often without being aware of it.
Cognitive bias
A systematic pattern of error in human thinking — a reliable way in which our judgements deviate from what careful reasoning would produce. Biases affect everyone, including experts.
Burden of proof
The responsibility to provide evidence for a claim. The burden of proof lies with the person making a positive claim — not with those who doubt it.
Skill-Building Activities
Activity 1 — Argument mapping: pulling arguments apart
Purpose: Students learn to identify premises and conclusions, distinguish valid from invalid arguments, and separate the question of validity from whether premises are actually true.
How to run it: Introduce the premise-conclusion structure with a simple example: Premise 1 — all humans are mortal. Premise 2 — Socrates is a human. Conclusion — therefore, Socrates is mortal. This is valid and sound. Now present four arguments for students to analyse — identifying premises, conclusion, whether the argument is valid, and whether the premises are actually true. Argument A: 'Screen time causes depression in teenagers. My son spends a lot of time on screens. Therefore he will become depressed.' Argument B: 'No country that has adopted Policy X has seen crime fall. This country is considering Policy X. Therefore adopting it will not reduce crime.' Argument C: 'This politician has lied before. Therefore everything she says is a lie.' Argument D: 'Most economists support free trade. Therefore free trade is correct policy.' For each argument ask: What are the premises? What is the conclusion? Does the conclusion follow from the premises? Are the premises actually true — and how confident can we be? After analysis, discuss: Which errors are about validity and which are about the truth of the premises? Why is it important to keep these questions separate?
💡 Low-resource tip: Teacher presents each argument verbally. Students discuss in groups and share findings. No materials needed.
Activity 2 — Cognitive biases in action: a self-diagnostic
Purpose: Students encounter and experience key cognitive biases directly — understanding not just what they are but why they feel convincing even when we know better.
How to run it: Run students through four short exercises that demonstrate biases through direct experience. Exercise 1 — Confirmation bias: ask students to spend 30 seconds thinking of evidence that supports a claim they already believe (for example, 'social media is bad for young people'). Then spend 30 seconds thinking of evidence against it. Ask: which was easier and why? Exercise 2 — Availability heuristic: ask whether there are more words in English that start with the letter K, or more words with K as the third letter. Most people say 'start with K' because examples come to mind more easily, but the answer is the third letter. What does this reveal about how we assess probability? Exercise 3 — Anchoring: tell students a number and ask whether the population of Tanzania is more or less than that number, then ask them to estimate the actual figure. Compare results with students who heard a very different anchor. Exercise 4 — In-group bias: describe how the same piece of scientific evidence is evaluated differently when people are told it supports their side versus the other side. After all four exercises, ask: What do these results tell us about the reliability of our own judgements? What can we do to compensate for biases we cannot simply eliminate?
💡 Low-resource tip: All four exercises can be run verbally and through discussion. No materials needed. The exercises are designed to be experiential — students discover the biases through their own responses rather than just being told about them.
Activity 3 — Steelmanning: arguing for what you disagree with
Purpose: Students practise the advanced skill of constructing the strongest possible version of an argument they personally disagree with.
How to run it: Explain steelmanning: most people argue against the weakest version of opposing views — this is called straw-manning. Steelmanning means deliberately finding the best version of the argument — the one that a thoughtful, intelligent, well-informed person on that side would actually make. This is harder, more honest, and more intellectually useful. Assign each student a position they personally disagree with. Give ten minutes to construct the strongest possible case for it — using the best evidence, the most sophisticated reasoning, and the most defensible version of the underlying values. Suitable positions include: 'Social media should be banned for under-16s.' / 'Wealthy countries should accept unlimited refugees.' / 'Animal farming should be made illegal.' / 'University education should be free for everyone.' After preparation, students present their steelman argument. The group's task is not to refute it but to ask: Is this a genuine steelman? Are there even stronger arguments on this side? What does engaging seriously with this argument teach us about our own position? Debrief: How did it feel to argue for something you disagree with? Did it change your view at all? What is the difference between understanding an argument and agreeing with it?
💡 Low-resource tip: Students prepare verbally and present verbally. No materials needed. This activity works best when positions are genuinely contested — not obviously wrong.
Reflection Questions
  • Q1: What is the difference between a valid argument and a sound argument? Why does this distinction matter in practice?
  • Q2: Confirmation bias means we tend to find evidence that confirms what we already believe. If this is a universal human tendency, what can we actually do about it?
  • Q3: Research on motivated reasoning suggests that confronting people with evidence that contradicts their beliefs can sometimes make them believe the false thing more strongly. What does this tell us about the limits of rational argument?
  • Q4: What is the difference between steelmanning and simply agreeing with the other side? Why is steelmanning more valuable than identifying weak counter-arguments?
  • Q5: Is it possible to be a genuinely good critical thinker and still hold strong personal values and commitments? Or does critical thinking inevitably lead to relativism?
  • Q6: Which cognitive bias do you think has had the most influence on your own thinking? How would you know?
  • Q7: Some argue that critical thinking is itself a culturally specific value associated with the Western rationalist tradition, and that other ways of knowing are equally valid. How would you respond to this argument?
Practice Tasks
Task 1 — Steelman both sides of a contested question
Choose a contested claim from the news or your studies. (a) State the claim clearly. (b) Steelman the strongest case FOR it. (c) Steelman the strongest case AGAINST it. (d) Evaluate the evidence on both sides. (e) State your conclusion and what would change your mind. Write 400 to 600 words.
Skills: Steelmanning, evidence evaluation, epistemic humility, and structured argument applied to a real contested question
Model Answer

Claim: Social media use is causing a mental health crisis among teenagers.

The strongest case for this claim begins with timing. Rates of depression, anxiety, and self-harm among teenagers — particularly girls — began rising sharply in the early 2010s in the US, UK, Australia, and other wealthy countries. This is exactly when smartphone adoption reached mass levels. Researchers Jean Twenge and Jonathan Haidt argue this correlation is too strong and too consistent across countries to be coincidental. The proposed mechanism is plausible: social media exposes teenagers to constant social comparison, cyberbullying, and curated images of idealised lives. Experimental studies have found that reducing social media use improves wellbeing. Internal documents from Meta, leaked by Frances Haugen in 2021, showed the company's own research found Instagram worsened body image for teenage girls — knowledge the company reportedly suppressed.

The strongest case against is methodological. The correlation between smartphone adoption and mental health decline is real, but correlation does not establish causation. Researchers Amy Orben and Andrew Przybylski conducted large-scale analyses and found the effect size of social media on wellbeing to be tiny — comparable to the effect of wearing glasses or eating potatoes. The mental health decline may have other causes that coincide with the smartphone era: economic insecurity, academic pressure, or climate anxiety. Research consistently shows most teenagers experience social media positively — for connection, self-expression, and finding communities, particularly for LGBTQ+ youth isolated in offline environments.

Evaluating the evidence: the disagreement is partly empirical and partly methodological. Both camps rely on legitimate research approaches with genuine limitations. The leaked Meta documents are significant because they show the company knew of potential harms and concealed them — which shifts the moral question even if the scientific question remains open.

My conclusion: the evidence is genuinely uncertain but not symmetrical. The plausible mechanisms of harm are well-documented, the companies involved had strong incentives to suppress negative findings, and the precautionary argument has real force. I would support stronger regulation of platform design features that optimise for engagement time among young users, while remaining sceptical of blanket age-based bans. What would change my mind: high-quality randomised controlled trials showing no effect of social media on mental health in adolescents, or compelling evidence that the decline has a different primary cause.

Marking Notes

Award full marks for: a clearly stated contested claim; a genuine steelman of both sides — not straw men; evidence cited with attention to its quality and limitations; a reasoned conclusion acknowledging uncertainty; an explicit statement of what would change the writer's mind. Deduct marks for responses that list arguments on both sides without evaluating them, or that conclude with certainty on a genuinely contested question without acknowledging the limits of evidence.

Task 2 — Bias audit
Choose ONE cognitive bias from the lesson. (a) Define it accurately. (b) Give a real example of it affecting public debate or decision-making. (c) Explain what someone should do differently to avoid or compensate for it. Write 200 to 300 words.
Skills: Defining a cognitive bias precisely, applying it to a real-world case, generating a practical and evidence-based remedy
Model Answer

Confirmation bias is the tendency to search for, notice, and interpret information in ways that confirm our existing beliefs — while giving less attention to information that challenges them. It is one of the most pervasive and well-documented cognitive biases, affecting experts and novices alike.

A clear example appeared during the COVID-19 pandemic. People who believed lockdowns were ineffective tended to highlight every study suggesting limited effectiveness, share anecdotes of rule-breaking, and interpret ambiguous data as confirming their view — while rationalising away evidence of lives saved. People who strongly believed lockdowns were effective showed the same pattern in the opposite direction. Both groups felt they were simply following the evidence, while in fact their interpretation was shaped by what they had already concluded. Political identity amplified the effect, making it harder for either group to update their views as new research appeared.

To compensate for confirmation bias, a person should deliberately seek out the strongest evidence against their current view — not just evidence for it. This means reading sources you typically disagree with, asking 'what would change my mind?', and treating surprising or uncomfortable evidence as worth examining rather than dismissing. Keeping a record of predictions and checking them against outcomes is another practical tool. Crucially, awareness alone is not enough — research consistently shows that knowing about confirmation bias does not make people significantly less susceptible to it. Structural remedies are more effective: seeking out a genuine devil's advocate, pre-specifying criteria for evaluating evidence before seeing results, or using formal decision protocols.

Marking Notes

Award marks for: an accurate definition capturing both the information-seeking and interpretation dimensions; a specific real-world example with a clear mechanism; at least two practical remedies with some indication of their effectiveness. Strong answers will note that awareness alone is insufficient and that structural approaches are needed.

Common Mistakes
Common misconception

Critical thinking means being sceptical of everything and trusting nothing.

What to teach instead

Critical thinking aims at accurate beliefs — which sometimes means accepting claims, not always doubting them. A critical thinker who evaluates strong evidence for a well-supported claim accepts it. Wholesale scepticism — refusing to believe anything — is itself an irrational position, because some claims are much better supported than others. The goal is calibrated confidence: strong confidence in well-supported claims, genuine uncertainty about genuinely uncertain ones, and refusal to accept poorly supported ones.

Common misconception

Once you know about cognitive biases, you can overcome them through willpower and awareness.

What to teach instead

Research consistently shows that knowing about cognitive biases does not make people significantly less susceptible to them. Confirmation bias persists even in people who understand it well and are trying to avoid it. Biases are often built into perceptual and processing systems below the level of conscious control. More effective strategies are structural — pre-committing to decision criteria before seeing results, seeking genuine devil's advocates, and using formal protocols for evidence evaluation. Epistemic humility — assuming you are probably biased in ways you cannot see — is more protective than confidence in your ability to self-correct.

Common misconception

A more intelligent person is automatically a better critical thinker.

What to teach instead

Intelligence and critical thinking are related but distinct. Research by Keith Stanovich and others shows that cognitive ability does not strongly predict reasoning quality on tasks involving biases and fallacies. Higher intelligence can sometimes make people better at rationalising conclusions they arrived at through bias — constructing more sophisticated justifications for motivated conclusions. This phenomenon is called myside bias or motivated reasoning, and it is not prevented by high intelligence. Critical thinking skills must be explicitly learned and practised.

Common misconception

If you can argue both sides of an issue equally well, you have fully understood it.

What to teach instead

Being able to generate arguments on both sides is useful but is not the same as understanding an issue. Critical thinking requires evaluating arguments — assessing the quality of evidence, the validity of reasoning, and the strength of competing considerations — not just listing them. A student who can list three arguments for and three against but cannot say which are stronger or what evidence would resolve the question has not fully applied critical thinking. The goal is reasoned judgement, not balanced lists.

Further Practice & Resources

Key texts and resources:

Daniel Kahneman, 'Thinking, Fast and Slow' (2011) — the most accessible account of cognitive biases and dual-process theory; Chapters 1 to 3 and 11 to 14 are most relevant.

Keith Stanovich, 'What Intelligence Tests Miss' (2009) — on the distinction between intelligence and rational thinking.

Carl Sagan's Baloney Detection Kit, from 'The Demon-Haunted World' (1995) — a classic short guide to critical thinking tools, freely available online.

The SIFT method — documented at checkplease.cc.

For cognitive biases: the Decision Lab (thedecisionlab.com) provides accessible summaries.

For teaching critical thinking: the Foundation for Critical Thinking (criticalthinking.org) publishes free classroom resources.

For research on motivated reasoning: Brendan Nyhan and Jason Reifler's work on belief updating is available through academic databases.

Rolf Dobelli, 'The Art of Thinking Clearly' — an accessible list of 99 cognitive biases with examples.