How to question, analyse, and evaluate ideas and information carefully rather than accepting them at face value — a practical skill for every classroom and every subject.
Critical thinking at Early Years level is about building the habit of pausing and questioning before accepting something as true. Children are naturally curious — the goal is to keep that curiosity alive and connect it to a few simple habits: look carefully, ask why, think about whether it makes sense, and hear other ideas before deciding. At this stage, critical thinking is taught through concrete, playful activities rather than abstract principles. Children do not need the phrase 'critical thinking' — they need to practise it. The most powerful teaching move is to model the habit yourself: say 'that is interesting — how do we know?' and 'let us look carefully before we decide.' In low-resource classrooms, all these activities work through talk and observation alone — no materials are needed. Avoid teaching children to be cynical or to distrust everything. The goal is thoughtful, curious engagement — not scepticism for its own sake.
Any drawing of a person thinking or questioning, with a completion such as: 'A good thinker always asks "how do you know?"' or 'A good thinker always looks carefully before deciding.' The goal is to help children identify themselves as thinkers and connect that identity to a concrete habit.
Look for a genuine thinking habit in the completion — not just 'a good thinker always thinks.' Discuss: what does this look like in real life? When did you do this today?
Before I believe something, I ask 'how do you know?' and look for more information. I changed my mind once when my friend showed me that the spider was not dangerous after all.
Accept any response that describes an active thinking step before believing, and any genuine example of changing one's mind. The second completion is especially valuable — children who can recall changing their mind have demonstrated genuine critical thinking in action.
Asking 'how do you know?' is rude or shows you do not trust the person.
Asking for reasons is a sign of respect for the truth — and for the person you are asking. It means you are taking what they say seriously enough to want to understand it better. Good teachers, scientists, and leaders all ask 'how do you know?' It is one of the most important questions a person can ask.
Changing your mind means you were wrong and should feel bad.
Changing your mind when you learn something new is a sign of good thinking. Scientists, doctors, and leaders change their minds all the time when new evidence appears. The goal is not to be right first time — it is to end up with the best possible answer. People who never change their minds are not stronger thinkers; they are just more stubborn.
Critical thinking is the disciplined habit of questioning, analysing, and evaluating ideas and information — rather than accepting them automatically. It is not cynicism (believing nothing) or contrarianism (disagreeing with everything) but the careful, open-minded search for good reasons to believe something. Critical thinking involves several component skills that can be practised separately and combined.
Asking better questions: not just 'what?' but 'why?', 'how do we know?', 'what is the evidence?', 'who says so?', and 'could this be wrong?' Distinguishing facts from opinions: a fact is a claim that can in principle be verified by evidence; an opinion is a claim about value, preference, or interpretation. Many claims fall in between — they are contested, meaning people disagree because the evidence is complex or because values differ. Identifying logical fallacies — common patterns of bad reasoning: attacking the person rather than the argument (ad hominem); believing something because many people do (bandwagon); assuming that because A came before B, A caused B (false cause); misrepresenting someone's argument to make it easier to attack (straw man); and accepting a claim simply because an authority said it, even when that authority has no relevant expertise (appeal to authority).
The SIFT method for evaluating online information: Stop before reacting; Investigate the source; Find better coverage; Trace claims to their origin. These skills are relevant in every subject and in everyday life.
Critical thinking is best taught through practice, not through explanation. Use real examples from students' lives wherever possible. Model the habits yourself — think aloud when you evaluate information, and celebrate students who ask good questions.
A claim I found was in a social media post: 'Scientists say that teenagers who use social media for more than two hours a day are three times more likely to feel depressed.' The post was shared by a campaign group advocating for restrictions on social media use, which means they have an interest in presenting the strongest possible evidence of harm. The post does not name which scientists or which study the statistic comes from, making it difficult to evaluate. I would ask: What study is this based on and was it peer-reviewed? Does 'more likely' mean social media caused the depression or just that the two are associated? Is the three times figure from the study itself or a journalist's interpretation of it?
Award marks for: identifying a genuine real-world claim; noting the source and any potential bias; identifying what evidence is given or absent; asking at least two specific evaluative questions. Strong answers will distinguish between correlation and causation, or between an original study and a report about it. Accept any well-chosen real-world claim.
Critical thinking means finding fault with everything.
Critical thinking is about evaluating ideas carefully — which sometimes means accepting them, not rejecting them. A critical thinker who looks at an argument and finds it well-supported by evidence accepts the conclusion. Critical thinking is not contrarianism. Its goal is to end up with the most accurate and best-supported beliefs — which sometimes means changing your mind towards what you initially rejected.
A strong argument is one that uses lots of facts and statistics.
Facts and statistics can support good arguments but can also be misused — taken out of context, cherry-picked, or misrepresented. A strong argument is one where the evidence genuinely supports the conclusion, where assumptions are stated, where counter-arguments are considered, and where the reasoning is valid. More data does not automatically mean better reasoning.
If something appears on the internet with lots of sources, it must be true.
Quantity of sources is not the same as quality. Multiple websites can repeat the same false claim, each appearing to confirm the others — a phenomenon called circular reporting. What matters is whether the sources are independent, whether they are reliable, whether they cite original evidence, and whether that evidence actually supports the claim.
Critical thinking is a natural ability — either you have it or you do not.
Critical thinking is a set of learnable skills. Like any skill, it improves with deliberate practice. Research shows that explicit instruction in critical thinking habits — asking specific questions, identifying fallacies, evaluating sources — produces measurable improvements in reasoning ability. No one is born a critical thinker; everyone can become one.
Critical thinking at secondary level requires engagement with the formal structure of arguments and with the psychological barriers to good thinking. The structure of arguments: an argument has premises (the reasons offered in support) and a conclusion (the claim being argued for). An argument is valid if the conclusion follows logically from the premises. It is sound if it is valid AND the premises are true. Many arguments are valid but unsound (the reasoning is correct but at least one premise is false); others are invalid (the conclusion does not follow even if the premises are true). Teaching students to separate these questions — 'Does the conclusion follow from the premises?' and 'Are the premises actually true?' — is one of the most valuable reasoning skills. Deductive reasoning moves from general principles to specific conclusions: if all premises are true and the argument is valid, the conclusion must be true. Inductive reasoning moves from specific observations to general conclusions: it can produce very strong support for a conclusion, but never certainty. Most reasoning in everyday life and science is inductive.
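For teachers who want to demonstrate validity mechanically, the distinction above can be made concrete with a truth-table check: an argument is valid exactly when no assignment of truth values makes every premise true and the conclusion false. The following sketch is a hypothetical illustration (not part of the original resource); it tests modus ponens against the classic invalid pattern of affirming the consequent, with premises and conclusions written as functions of two atomic propositions p and q.

```python
from itertools import product

def is_valid(premises, conclusion):
    """An argument is valid if no truth assignment makes all
    premises true while the conclusion is false."""
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            return False  # counterexample found: premises true, conclusion false
    return True

# Modus ponens: "If p then q; p; therefore q" -- a valid form.
modus_ponens = is_valid(
    [lambda p, q: (not p) or q,  # premise 1: if p then q
     lambda p, q: p],            # premise 2: p
    lambda p, q: q)              # conclusion: q

# Affirming the consequent: "If p then q; q; therefore p" -- invalid.
affirming = is_valid(
    [lambda p, q: (not p) or q,  # premise 1: if p then q
     lambda p, q: q],            # premise 2: q
    lambda p, q: p)              # conclusion: p

print(modus_ponens)  # True
print(affirming)     # False
```

Note that the checker tests only validity, not soundness; whether the premises are actually true is an empirical question that no truth table can answer, which is precisely the separation of questions the text recommends teaching.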
Not all evidence is equally strong. A hierarchy of evidence exists in medicine and social science — randomised controlled trials are stronger than case studies; peer-reviewed meta-analyses are stronger than single studies; systematic reviews are stronger than expert opinion. Cognitive biases are systematic errors in human reasoning.
Confirmation bias (seeking evidence that confirms existing beliefs); availability heuristic (overestimating the likelihood of things that come easily to mind); anchoring (over-relying on the first piece of information encountered); in-group bias (treating evidence more favourably when it supports one's own group); and the Dunning-Kruger effect (people with limited knowledge tend to overestimate their competence). These biases affect experts as well as novices and often cannot be overcome simply by knowing about them. Steelmanning — deliberately constructing the strongest possible version of the argument you disagree with — is one of the most powerful intellectual practices. It ensures genuine engagement with opposing views. Epistemic humility — knowing the limits of what you know and being willing to say 'I do not know' — is a key intellectual virtue.
Research suggests that people can sometimes become more entrenched in false beliefs when confronted with contradicting evidence, though later studies indicate this 'backfire effect' is less common than first reported. Understanding when and why this happens is an important part of applied critical thinking.
Claim: Social media use is causing a mental health crisis among teenagers.
The strongest case for this claim begins with timing. Rates of depression, anxiety, and self-harm among teenagers — particularly girls — began rising sharply in the early 2010s in the US, UK, Australia, and other wealthy countries. This is exactly when smartphone adoption reached mass levels. Researchers Jean Twenge and Jonathan Haidt argue this correlation is too strong and too consistent across countries to be coincidental. The proposed mechanism is plausible: social media exposes teenagers to constant social comparison, cyberbullying, and curated images of idealised lives. Experimental studies have found that reducing social media use improves wellbeing. Internal documents from Meta, leaked by Frances Haugen in 2021, showed the company's own research found Instagram worsened body image for teenage girls — knowledge the company reportedly suppressed.
The strongest case against is methodological. The correlation between smartphone adoption and mental health decline is real, but correlation does not establish causation. Researchers Amy Orben and Andrew Przybylski conducted large-scale analyses and found the effect size of social media on wellbeing to be tiny — comparable to the effect of wearing glasses or eating potatoes. The mental health decline may have other causes that coincide with the smartphone era: economic insecurity, academic pressure, or climate anxiety. Research consistently shows most teenagers experience social media positively — for connection, self-expression, and finding communities, particularly for LGBTQ+ youth isolated in offline environments.
Evaluating the evidence: the disagreement is partly empirical and partly methodological. Both camps rely on legitimate research approaches with genuine limitations. The leaked Meta documents are significant because they show the company knew of potential harms and concealed them — which shifts the moral question even if the scientific question remains open.
My conclusion: the evidence is genuinely uncertain but not symmetrical. The plausible mechanisms of harm are well-documented, the companies involved had strong incentives to suppress negative findings, and the precautionary argument has real force. I would support stronger regulation of platform design features that optimise for engagement time in young users, while being sceptical of blanket age-based bans. What would change my mind: high-quality randomised controlled trials showing no effect of social media on mental health in adolescents, or compelling evidence that the decline has a different primary cause.
Award full marks for: a clearly stated contested claim; a genuine steelman of both sides — not straw men; evidence cited with attention to its quality and limitations; a reasoned conclusion acknowledging uncertainty; an explicit statement of what would change the writer's mind. Deduct marks for responses that list arguments on both sides without evaluating them, or that conclude with certainty on a genuinely contested question without acknowledging the limits of evidence.
Confirmation bias is the tendency to search for, notice, and interpret information in ways that confirm our existing beliefs — while giving less attention to information that challenges them. It is one of the most pervasive and well-documented cognitive biases, affecting experts and novices alike.
A clear example appeared during the COVID-19 pandemic. People who believed lockdowns were ineffective tended to highlight every study suggesting limited effectiveness, share anecdotes of rule-breaking, and interpret ambiguous data as confirming their view — while rationalising away evidence of lives saved. People who strongly believed lockdowns were effective showed the same pattern in the opposite direction. Both groups felt they were simply following the evidence, while in fact their interpretation was shaped by what they had already concluded. Political identity amplified the effect, making it harder for either group to update their views as new research appeared.
To compensate for confirmation bias, a person should deliberately seek out the strongest evidence against their current view — not just evidence for it. This means reading sources you typically disagree with, asking 'what would change my mind?', and treating surprising or uncomfortable evidence as worth examining rather than dismissing. Keeping a record of predictions and checking them against outcomes is another practical tool. Crucially, awareness alone is not enough — research consistently shows that knowing about confirmation bias does not make people significantly less susceptible to it. Structural remedies are more effective: seeking out a genuine devil's advocate, pre-specifying criteria for evaluating evidence before seeing results, or using formal decision protocols.
Award marks for: an accurate definition capturing both the information-seeking and interpretation dimensions; a specific real-world example with a clear mechanism; at least two practical remedies with some indication of their effectiveness. Strong answers will note that awareness alone is insufficient and that structural approaches are needed.
Critical thinking means being sceptical of everything and trusting nothing.
Critical thinking aims at accurate beliefs — which sometimes means accepting claims, not always doubting them. A critical thinker who evaluates strong evidence for a well-supported claim accepts it. Wholesale scepticism — refusing to believe anything — is itself an irrational position, because some claims are much better supported than others. The goal is calibrated confidence: strong confidence in well-supported claims, genuine uncertainty about genuinely uncertain ones, and refusal to accept poorly supported ones.
Once you know about cognitive biases, you can overcome them through willpower and awareness.
Research consistently shows that knowing about cognitive biases does not make people significantly less susceptible to them. Confirmation bias persists even in people who understand it well and are trying to avoid it. Biases are often built into perceptual and processing systems below the level of conscious control. More effective strategies are structural — pre-committing to decision criteria before seeing results, seeking genuine devil's advocates, and using formal protocols for evidence evaluation. Epistemic humility — assuming you are probably biased in ways you cannot see — is more protective than confidence in your ability to self-correct.
A more intelligent person is automatically a better critical thinker.
Intelligence and critical thinking are related but distinct. Research by Keith Stanovich and others shows that cognitive ability does not strongly predict reasoning quality on tasks involving biases and fallacies. Higher intelligence can sometimes make people better at rationalising conclusions they arrived at through bias — constructing more sophisticated justifications for motivated conclusions. This phenomenon is called myside bias or motivated reasoning, and it is not prevented by high intelligence. Critical thinking skills must be explicitly learned and practised.
If you can argue both sides of an issue equally well, you have fully understood it.
Being able to generate arguments on both sides is useful but is not the same as understanding an issue. Critical thinking requires evaluating arguments — assessing the quality of evidence, the validity of reasoning, and the strength of competing considerations — not just listing them. A student who can list three arguments for and three against but cannot say which are stronger or what evidence would resolve the question has not fully applied critical thinking. The goal is reasoned judgement, not balanced lists.
Key texts and resources: Daniel Kahneman, 'Thinking, Fast and Slow' (2011) — the most accessible account of cognitive biases and dual-process theory; Chapters 1 to 3 and 11 to 14 are most relevant. Keith Stanovich, 'What Intelligence Tests Miss' (2009) — on the distinction between intelligence and rational thinking. Carl Sagan's Baloney Detection Kit from 'The Demon-Haunted World' (1995) — a classic short guide to critical thinking tools, freely available online. The SIFT method is documented at checkplease.cc. For cognitive biases: the Decision Lab (thedecisionlab.com) provides accessible summaries. For teaching critical thinking: the Foundation for Critical Thinking (criticalthinking.org) publishes free classroom resources. For research on motivated reasoning: Brendan Nyhan and Jason Reifler's work on belief updating is available through academic databases. Rolf Dobelli's 'The Art of Thinking Clearly' provides an accessible list of 99 cognitive biases with examples.