Nick Casey is Head of Primary at Dulwich College Suzhou, with over 15 years of leadership experience across Australia and international contexts. A CIS-trained evaluator, he brings deep expertise in whole-school strategy, curriculum design, assessment systems, and professional learning. Having served as Deputy Head, Dean of Primary, and Whole College Director of Teaching and Learning, Nick offers a broad, systems-level perspective on school improvement. He champions relational leadership, evidence-informed practice, and high-performance cultures, and is recognised for driving sustainable improvement through strong community partnership and organisational coherence.
A central task of education is to develop thinking. Yet students now have access to tools that can complete many traditionally cognitive tasks, including drafting essays, solving equations, generating code, and summarising texts, with little more than a prompt. What began as a technological novelty has quickly become embedded in everyday educational practice.
Unsurprisingly, early debate focused on academic integrity. If high‑quality work can be produced instantly, how do we prevent misuse? But cheating is not the most significant risk facing schools. The deeper challenge is cognitive offloading: the gradual outsourcing of thinking to intelligent systems.
For decades, researchers have observed how humans rely on tools to reduce mental effort. Calculators changed how we approach arithmetic. Search engines transformed how we access information. GPS reshaped spatial navigation, quietly replacing printed street directories, or, as we called them in Australia, the Refidex.
Research in cognitive psychology reminds us that effort matters. When students retrieve information, wrestle with ideas, and work through uncertainty, learning becomes stronger and more durable. When thinking is done for them, understanding is often shallow and short-lived. Educational thinker Guy Claxton writes about building “learning power,” the capacity to persist, question, and think independently. These habits develop when students grapple with complexity and navigate challenge.
Artificial intelligence operates at a higher level than previous technologies. It does not simply retrieve information. It can construct arguments, summarise complex ideas, and generate interpretations in seconds. If students rely on AI to do this higher-level thinking without questioning, refining, or challenging what it produces, they may complete tasks more quickly, but they miss the very process that builds analytical strength.
Used deliberately, AI can deepen learning. It can help students test ideas, compare perspectives, and refine drafts. It can accelerate feedback and broaden exposure to viewpoints.
But when AI begins to perform the core reasoning itself, learning shifts away from the student. Efficiency increases, yet intellectual growth does not. Critical thinking develops through analysing evidence, weighing interpretations, and justifying conclusions. When that reasoning is displaced, those habits weaken.
The risk, then, is not the presence of AI in classrooms, but the gradual erosion of cognitive engagement if its use goes unexamined. Critical thinking has never been about speed. It requires time, ambiguity, and disciplined reasoning. In a world where answers are immediate, discernment becomes even more important. Students must learn not only how to use intelligent systems, but how to question them. They need to recognise that coherence does not guarantee accuracy, and that confidence in tone does not ensure credibility.
Increasingly, schools and international organisations are acknowledging that the rapid development of artificial intelligence demands principled and coordinated responses. The Council of International Schools has launched a Taskforce on Emerging Technology to help shape ethical and future‑focused guidance for schools navigating this terrain. I am encouraged by this work and look forward to seeing how these frameworks support schools in aligning innovation with their core educational values.
That alignment only matters if it is visible in classroom practice. Students do not automatically treat AI as a thinking partner. They will often treat it as a shortcut unless we explicitly teach otherwise. The difference lies in how we frame its use and what we require students to do with its output.
Increasingly, schools are investing time in teaching students how to use AI tools effectively. While these skills have practical value, they are not sufficient. Equal attention must be given to helping students understand what AI is, how it generates responses, and where its limitations lie. Without that foundation, students may mistake fluency for authority and coherence for truth. A curriculum that addresses both the mechanics of use and the critical evaluation of outputs ensures that students remain thinkers first and users second.
This clarity must extend beyond the classroom. Parents, as partners in education, need to understand how and why AI is being used so they can reinforce habits of questioning and intellectual ownership at home.
The cultivation of critical thinking has always depended on deliberate practice. Students learn to analyse by analysing and to reason by reasoning. A student might explain why a character feels left out, pointing to specific moments in the story. Another might analyse how an author’s perspective shapes what is revealed and what is withheld. In both cases, the thinking lies in selecting the relevant moments, inferring what they reveal, and judging how perspective guides interpretation, not in assembling a fluent response.
In an environment where AI can simulate these processes, schools must ensure that learners still experience them firsthand. This does not require rejecting innovation, but designing for intellectual engagement and asking, in every subject area, whether the selecting, inferring, and judging illustrated above remain with the student rather than being displaced by the tool.
If critical thinking is to remain central in an AI-rich environment, it has to be cultivated deliberately over time. It is shaped gradually through repeated expectations about how ideas are handled.
From the earliest years of schooling, students are encouraged to explain their reasoning. They are asked not only for answers, but for the thinking behind them. They learn to notice when something feels unclear and to ask questions rather than accept statements at face value.
As digital tools become more sophisticated, those habits are extended rather than replaced. When students encounter AI-generated responses, the expectation is not passive acceptance but active evaluation. They are invited to consider what might be missing, how a perspective has been framed, and whether alternative interpretations exist. In doing so, they begin to understand that technology can produce language fluently, but judgment remains human.
Over time, that expectation matures into accountability. Older students are expected to demonstrate ownership of the reasoning attached to their work. If they use AI to explore an idea or refine a draft, they must be able to articulate how it shaped their thinking and where they chose to diverge from it. The emphasis shifts from what was produced to how it was examined.
Throughout this progression, productive struggle remains protected. Students continue to read deeply, draft independently, and sit with complexity before seeking assistance. AI is positioned as a tool that can extend thinking after effort has occurred, not a mechanism that replaces it.
In this way, the presence of intelligent systems does not dilute our educational purpose. It sharpens it. Each stage of learning reinforces the same underlying message: reasoning belongs to the learner.
In the end, the question is not whether students will use AI. They will. The question is what habits of mind we cultivate alongside it. If intelligent systems make answers easier to produce, then our responsibility is to make questioning more deliberate, more visible, and more valued within our classrooms.
In an age of intelligent systems, preserving the habit of questioning may be one of the most important responsibilities we hold as educators.
