Decision making is about how to make good choices, especially when information is incomplete, when values are in tension, and when the stakes matter. It is not about finding the perfect answer. It is about thinking clearly about what you know, what you value, and what the consequences might be, and then acting with appropriate confidence and appropriate humility.
Decision making at Early Years level is about building the foundational habits of deliberate choice — pausing before acting, thinking about consequences, considering others, and being willing to revise. Young children are developmentally in the process of building executive function — the cognitive capacities that enable self-control, working memory, and flexible thinking — and this development is directly supported by environments that require and reward thoughtful rather than impulsive choice. The most important classroom practice at this level is to slow the moment between impulse and action: to create moments where children are explicitly asked "What might happen if you do that?" before they act, and "What did you think would happen?" after. This does not mean withholding action — it means building the metacognitive habit of checking in with thinking before reacting. The connection between decision making and emotions is particularly important at this age: strong emotions typically accelerate decision making and reduce its quality. Helping children learn to notice emotional states ("I am feeling very angry right now") before acting on them ("and that is making me want to grab the toy") is one of the most valuable things early childhood education can do. In communities where children have significant responsibility — for younger siblings, for household tasks, for contributing to family income — decision-making education is not abstract but immediately practical. These children are already making real decisions with real consequences and deserve frameworks for thinking about them.
A drawing showing a fork in a path, with each path labelled with a different option and showing where it leads. A complete response names the specific decision, the specific alternatives, the choice made, and a "because" statement that explains the reasoning.
Ask: did the decision affect only you or other people too? What did those other people think? The second question reveals whether the child is beginning to include others' perspectives in their decision-making.
A decision I made that turned out well was deciding to tell a teacher when I saw my friend being bullied, even though I was worried my friend would be angry with me. I made it by thinking about what would happen to my friend if I did not tell anyone, and deciding that their safety was more important than avoiding an argument. A decision I made that I would do differently is getting angry and saying something mean when a classmate took my place in line, because now I know that getting angry made a small problem into a bigger one and I could have just asked calmly.
The second reflection is more important than the first — learning from a decision that went poorly requires genuine honesty and produces more growth than celebrating a good decision. Celebrate honest self-reflection rather than idealised accounts of decisions.
If a decision turns out badly, it was a bad decision.
The quality of a decision should be judged by the quality of the process that produced it, not only by the outcome. A well-reasoned decision with good information can still have a bad outcome due to factors that could not have been anticipated. A poorly reasoned, impulsive decision can sometimes lead to a good outcome by luck. Outcome and process are different things. Learning to evaluate decision processes — was the information good? were alternatives considered? were other people included? — rather than only outcomes is one of the most important habits of sound decision making.
Adults always make better decisions than children.
Adults generally have more experience, more information, and more developed cognitive capacities for reasoning about complex situations. For this reason, children should usually take adult guidance seriously on important decisions. But adults also make poor decisions — through bias, insufficient information, emotional reactivity, and self-interest. The quality of a decision depends on the quality of the process, not the age of the decision-maker. Children can and do make good decisions, and learning to reason well about choices is something that improves with practice at every age.
The right decision is always the one that makes you happy right now.
Immediate happiness and long-term wellbeing are often different — and good decision making requires taking both into account. Eating all the food now might feel satisfying but leave nothing for later. Telling a difficult truth might feel uncomfortable now but build trust over time. The research on decision making consistently shows that people systematically underweight future consequences relative to immediate ones — a tendency called present bias. Learning to ask not just what will this feel like right now but what will this look like in a week, a month, a year is one of the most valuable decision-making habits there is.
Decision making at primary level introduces students to a structured approach to significant decisions — one that can be applied deliberately and that improves with practice. A practical decision-making framework suitable for primary students has four stages. Clarify the decision: what exactly am I choosing, when does it need to be made, and what are the constraints? Identify options: what are the realistic choices available? Evaluate options: what are the likely consequences of each, who is affected and how, and what do my values say about each? Decide and review: choose based on the evaluation, implement the choice, and review what happened.
Most significant decisions are not purely factual — they involve choices between things that are valued differently. A student deciding whether to report a friend who has done something wrong is not simply choosing between actions (report or don't report) but between values (loyalty to a friend versus honesty and accountability). Helping students identify the values at stake in a decision is one of the most important parts of decision-making education.
Most real-world decisions are made under uncertainty — we cannot know the outcomes with certainty before choosing. Teaching students to think probabilistically — to estimate how likely different outcomes are and to weigh them against their importance — is a key decision-making skill. The concept of expected value (roughly, probability times magnitude of outcome, summed across all possible outcomes) is accessible at primary level in a simplified form: how likely is this? How much does it matter?
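The simplified expected-value question — how likely is this, and how much does it matter? — can be made concrete with a short sketch. The scenario and all the numbers below are invented purely for illustration:

```python
def expected_value(outcomes):
    """outcomes: list of (probability, value) pairs whose probabilities sum to 1.
    Expected value = sum of probability * value across all possible outcomes."""
    return sum(p * v for p, v in outcomes)

# Hypothetical choice: carry an umbrella or leave it, with a 25% chance of rain.
# Values are made-up "how much does it matter?" scores (negative = bad).
carry = [(0.25, -1), (0.75, -1)]    # small nuisance of carrying, rain or not
leave = [(0.25, -10), (0.75, 0)]    # fine if dry, very bad if caught in rain

print(expected_value(carry))  # -1.0
print(expected_value(leave))  # -2.5  -> carrying has the better expected value
```

The point for primary students is not the arithmetic itself but the habit it encodes: a small cost that is certain can be better than a large cost that is merely possible.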
Present bias (overweighting immediate outcomes), sunk cost fallacy (continuing a course of action because of past investment), confirmation bias (seeking only information that supports the preferred option), and anchoring (being overly influenced by the first piece of information received) all affect decision quality. At primary level, the most practically important is present bias — the tendency to choose options with better immediate but worse long-term consequences.
Most significant decisions are made by or with others. The conditions that make group decisions better than individual ones — diversity, genuine participation, explicit disagreement — and those that make them worse — groupthink, status effects, and information cascades — are important for any collaborative context.
The decision I face is whether to take on a part-time job selling water at the market in the afternoons, which would give my family extra income but would reduce my time for schoolwork and rest. The realistic options are: take the job; decline it; negotiate to work only on weekends; or ask whether my younger sibling could share the work. The criteria that matter most to me are: impact on my family's income (high weight), impact on my studies (high weight), my own wellbeing and rest (medium weight), and fairness to my sibling (medium weight). Evaluating the options: taking the job full-time scores well on income but poorly on studies and rest; declining scores well on studies but poorly on income; working weekends only scores reasonably on both; sharing with my sibling might work but depends on whether they are willing and able. My decision is to propose the weekend-only arrangement first, because it partially meets the income need while protecting the most important study time. If that is not acceptable to the employer, I will consider the shared arrangement before accepting the full-time schedule.
Award marks for: a specific and real decision rather than a hypothetical one; a complete list of options that includes creative alternatives not just the obvious two; criteria that reflect genuine values rather than just practical factors; an evaluation that takes all criteria seriously; and a decision with reasoning that is consistent with the evaluation. Strong answers will acknowledge trade-offs — that the chosen option does not score best on every criterion — and will explain why the prioritisation they used was appropriate.
The ethical dilemma I am describing is faced by a nurse at a small rural health clinic who discovers that a colleague — the only other health worker for fifty kilometres — has been taking medicines from the clinic's limited supply for personal use. The values in tension are: honesty and accountability (the theft should be reported) versus community welfare (reporting the colleague might mean the clinic closes or operates with only one health worker, leaving the community even more vulnerable). The strongest argument for reporting is that the missing medicines could harm patients, the behaviour is likely to continue and worsen, and accountability matters. The strongest argument for not reporting is that the practical consequence — loss of the only other health worker — could cause more harm than the individual theft. What I would do is speak directly with the colleague first, making clear that I know what is happening and that it must stop, setting a clear limit before escalating. I would want to know more about: whether the reporting authority would actually take action, whether there is any replacement health worker available, and whether the colleague might stop if confronted.
Award marks for: a genuine dilemma where both options have real moral costs — not a case where one option is obviously right; honest representation of the strongest argument for each side; a decision that is consistent with the analysis but acknowledges the remaining moral cost of the chosen option; and additional information that would genuinely affect the decision. Strong answers will resist the temptation to make the dilemma easier than it is — a student who acknowledges that their chosen option is genuinely imperfect demonstrates more moral maturity than one who claims a clean solution.
More options always make decisions easier and better.
Research by Barry Schwartz on what he called the paradox of choice shows that beyond a certain number of options, more choice produces worse decisions and less satisfaction. When the number of options is very large, people use less systematic reasoning, experience more regret about unchosen options, and are less satisfied with their final choice. Practical good decision making often involves limiting options to a manageable number before evaluating carefully, rather than trying to consider all possibilities. This is not a failure of decision making — it is rational management of cognitive capacity.
Emotions should be excluded from decision making — good decisions are purely rational.
Emotions are not a threat to good decision making but an essential component of it. Research by Antonio Damasio on patients with damage to the emotional processing centres of the brain shows they are unable to make effective decisions despite intact logical reasoning — because they cannot assign relative value to options. Emotions communicate what matters, what is at stake, and what our values demand. The goal is not to exclude emotions from decision making but to ensure they inform rather than override the deliberative process — to use emotional information alongside reasoned analysis.
The decision with the best expected outcome is always the right one.
Expected value calculations — averaging outcomes weighted by their probability — are useful but incomplete guides to decision making. They ignore risk aversion (people rationally prefer lower-variance outcomes even at some expected-value cost, particularly for irreversible decisions or decisions that affect survival), equity (who bears the costs and who receives the benefits matters morally, not just the total), and the limits of probability estimation (we often cannot reliably estimate the probabilities of outcomes for novel decisions). A decision that is optimal in expected value terms can be irrational if the downside risk is catastrophic, if the costs are borne by the most vulnerable, or if the probability estimates are unreliable.
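The risk-aversion point can be illustrated with a minimal sketch comparing expected value with expected utility under a concave utility function. The gamble and the numbers are hypothetical; the square-root utility is a standard textbook stand-in for risk aversion, not a claim about how any real person values outcomes:

```python
import math

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs; probabilities sum to 1."""
    return sum(p * x for p, x in outcomes)

def expected_utility(outcomes, u):
    """Weight each payoff by probability after passing it through utility u."""
    return sum(p * u(x) for p, x in outcomes)

# A concave utility models risk aversion: each extra unit is worth
# a little less than the last, so variance itself becomes costly.
u = math.sqrt

gamble = [(0.5, 100), (0.5, 0)]   # 50/50 chance of 100 or nothing
sure   = [(1.0, 49)]              # 49 for certain

print(expected_value(gamble), expected_value(sure))            # 50.0 49.0
print(expected_utility(gamble, u), expected_utility(sure, u))  # 5.0 7.0
```

The gamble has the higher expected value, yet the risk-averse agent rationally prefers the sure 49 — exactly the gap between expected-value optimality and a defensible decision that the paragraph above describes.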
Once you have made a decision, you should stick to it — changing your mind is weakness.
Revising a decision in response to new information or changed circumstances is not weakness — it is good epistemic practice. The relevant question is whether the revision is driven by new information (appropriate) or by discomfort, social pressure, or sunk cost reasoning (not appropriate). A decision made with good information and process should be revised when new information changes the analysis. Refusing to revise a decision because of pride or the desire to appear consistent is one of the most common and most costly decision-making errors — it is called escalation of commitment or the sunk cost fallacy in research on decision making.
Decision making at secondary level engages students with the deeper psychology, philosophy, and political dimensions of choice — how expert judgment actually works, how psychological research has revealed systematic departures from rational choice, how moral philosophy provides frameworks for ethical decisions, and how power shapes who gets to make decisions that affect others.
Research by Gary Klein and others on naturalistic decision making shows that experts in high-stakes domains (firefighters, intensive care nurses, military commanders) rarely use formal decision analysis. Instead, they use pattern recognition — drawing on extensive experience to rapidly identify which situation type they face and what has worked in similar situations before. This is neither pure intuition nor formal analysis but rapid expert pattern matching. The implication: for genuinely novel decisions or decisions outside one's experience, structured analysis is most valuable; for decisions that match familiar patterns, drawing on accumulated expertise is often more reliable.
Daniel Kahneman and Amos Tversky's prospect theory, the foundational behavioural economics model, identifies systematic deviations from rational expected utility maximisation: loss aversion (losses feel approximately twice as painful as equivalent gains feel pleasurable), the certainty effect (overweighting certain outcomes relative to probable ones), and probability distortion (overweighting small probabilities and underweighting moderate to large ones). These systematic biases have been replicated across cultures and age groups and have significant practical implications for personal, financial, and policy decision making.
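The loss-aversion asymmetry can be sketched with the prospect-theory value function. The functional form is Kahneman and Tversky's; the parameter values below (curvature 0.88, loss-aversion coefficient 2.25) are their widely cited 1992 median estimates, used here only to illustrate the shape:

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, convex and
    steeper for losses. lam > 1 encodes loss aversion."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

print(value(100))   # ~57.5  subjective value of gaining 100
print(value(-100))  # ~-129.5  the same-sized loss looms much larger
```

The asymmetry in the two printed values is loss aversion in miniature: a loss of 100 hurts roughly 2.25 times as much as a gain of 100 pleases, which is why people reject many bets with positive expected value.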
The major ethical frameworks — consequentialism (outcomes), deontology (duties and rights), virtue ethics (character), and contractualism (what principles could be agreed to by all affected) — provide distinct decision procedures for ethical dilemmas. Each captures genuine moral insights and each has limitations. Understanding them as tools for illuminating different aspects of a decision rather than as rival theories that must be accepted or rejected in their entirety is the most practically useful approach.
Rational people always make better decisions than emotional ones.
The contrast between rational and emotional decision making is a false dichotomy. Damasio's somatic marker hypothesis demonstrates that emotional signals are necessary for effective decision making — people without normal emotional processing make dramatically worse decisions despite intact logical reasoning. Emotions carry information about values, risks, and past experience that purely analytical processes cannot replicate. The goal is not emotionless rationality but emotional information integrated with careful analysis. The most effective decision makers are those who can notice and use emotional signals without being overwhelmed by them.
If everyone in a group agrees, the decision must be right.
Group consensus can reflect genuine convergence on a well-reasoned position — or it can reflect groupthink, the suppression of dissent, social pressure, or information cascades (later participants deferring to earlier ones regardless of their private information). Research on group decision making shows that unanimous agreement is sometimes a warning sign rather than a reassurance — particularly in cohesive groups with strong social pressure to conform. The most effective group decision processes explicitly solicit dissenting views, assign devil's advocate roles, and evaluate the quality of reasoning behind agreement rather than treating unanimity as its own evidence.
Ethical frameworks provide definitive answers to moral questions.
Ethical frameworks are tools for illuminating different aspects of moral situations — not algorithms that produce definitive correct answers. They frequently disagree with each other on the same case. Even within a single framework, reasonable people applying it carefully can reach different conclusions when they weight values differently or interpret principles differently. The most honest and most useful approach to ethical frameworks is not to adopt one and apply it mechanically but to use multiple frameworks as lenses that together reveal more of the moral complexity of a situation than any single one could alone. Genuine moral wisdom involves holding this complexity with epistemic humility rather than seeking false certainty.
Better information always leads to better decisions.
More information does not automatically produce better decisions — and can produce worse ones if it exceeds cognitive capacity, introduces new biases, or creates false confidence. Information overload is a genuine phenomenon: beyond a certain volume of information, decision quality deteriorates rather than improving. Additionally, more information can increase analysis paralysis — the inability to decide because every option now appears to have more downsides. The relationship between information and decision quality is non-linear and depends on the quality and relevance of information, the cognitive capacity of the decision maker, and the time available for deliberation.
Key texts and resources: Daniel Kahneman's Thinking, Fast and Slow (2011, Farrar, Straus and Giroux) is the most accessible and comprehensive account of the psychology of decision making — covering heuristics, biases, and prospect theory in depth. It is essential reading for teachers and suitable for strong secondary students. Richard Thaler and Cass Sunstein's Nudge (2008, Yale) applies behavioural insights to policy design and is the most accessible treatment of choice architecture.
For naturalistic decision making: Gary Klein's Sources of Power (1998, MIT Press) documents how experts actually make decisions under pressure. Phil Rosenzweig's The Halo Effect (2007, Free Press) examines how outcome bias distorts learning from business decisions. Annie Duke's Thinking in Bets (2018, Portfolio) is the most engaging popular treatment of decision making under uncertainty.
For ethical frameworks: Peter Singer's Practical Ethics (1993, Cambridge) is the most rigorous and accessible consequentialist text. Immanuel Kant's Groundwork of the Metaphysics of Morals is the foundational deontological text — challenging but important. Alasdair MacIntyre's After Virtue (1981, Notre Dame) is the most influential treatment of virtue ethics. T.M. Scanlon's What We Owe to Each Other (1998, Harvard) is the foundational contractualist text.
For collective decision making: James Surowiecki's The Wisdom of Crowds (2004, Doubleday) examines conditions under which groups make better decisions than individuals. Cass Sunstein and Reid Hastie's Wiser (2015, Harvard) directly addresses group decision-making failures and solutions.
For decision making under genuine uncertainty: Nassim Taleb's The Black Swan (2007, Random House) examines decisions in domains dominated by unpredictable extreme events — particularly relevant for decisions about catastrophic risk.