Sethi De Clercq, Computing Lead, Head of Key Stage 1, Rugby School Thailand

Sethi De Clercq is a passionate educator with experience across Early Years, Primary, and Secondary education. As Year 2 Teacher, Head of Key Stage 1, and Computing Lead at Rugby School Thailand, he leads with a focus on inclusive practice, high expectations, and meaningful integration of technology. Beyond the classroom, Sethi is a keynote speaker and trainer who has presented at conferences worldwide on EdTech and the future of education. He also runs the YouTube channel Flipped Classroom Tutorials, where he shares practical tools and strategies to empower teachers. His work centres on supporting educators globally to make confident, thoughtful choices about teaching, learning, and digital innovation.


We are currently navigating a fascinating, albeit at times overwhelming, era in technology. This past year, it has felt as though every software vendor, platform, and application has suddenly integrated artificial intelligence into its core offering. Tech companies are moving at breakneck speed, fiercely competing in an AI arms race that leaves many educators and parents feeling as though they are constantly struggling to catch up. The news won’t stop talking about how AI and its uses are impacting everyday life. And above all, moral panic, not unlike that which greeted the calculator, the printing press, or the internet entering our living rooms, is back for another round!

However, as school leaders, our most vital role right now is to step back, take a collective breath, and reframe the narrative. Artificial intelligence is not a looming threat to be feared, nor is it a passing fad to be ignored. Instead, it must be respected as an inevitable, world-changing technology, fundamentally shifting our landscape much like the advent of electricity, the internet, or mobile devices.

Applying the Pedagogy-First Filter

Once we move beyond the initial anxiety surrounding this technology and shifting landscape, we can begin to lead with clarity. The goal is no longer about simply getting technology into the classroom; it is about acting as a critical filter between the aggressive push of the technology industry and the wellbeing of our school communities.

For years, there has been a tendency to adopt new technology simply for technology’s sake, driven by a fear of being left behind. Yet, true leadership in the AI era requires the confidence to say no. We must examine artificial intelligence and its impact with the exact same rigour and unbiased approach we apply to our most important school decisions, and in a wider sense we must apply this attitude to all technology integration efforts.

When we look at a new curriculum framework or decide which vendors we trust with personally identifiable information, we employ strict, uncompromising standards. AI tools must face this same scrutiny. We must apply a “pedagogy-first” filter to every piece of software. If an AI feature does not serve a clear, demonstrable pedagogical purpose, or if it obscures the cognitive heavy lifting we want our students to engage in, it does not belong in our classrooms. The learning objective must always dictate the technology, never the other way around. At the end of the day, technology companies are in the business of user adoption and retention, not meaningful pedagogical impact, though some will no doubt claim otherwise.

Rethinking Modern Safeguarding

This rigorous approach naturally extends into how we manage safeguarding. Historically, digital safeguarding in schools leaned heavily on web filters and restricted access. In the age of open information access and generative AI, this reactive model is no longer sufficient.

AI awareness must become, if it isn’t already, an embedded, foundational part of our safeguarding practices, staff training, and overall provision. It is no longer an isolated issue for the IT department to manage; it is a whole-school responsibility. We need to ensure our staff are equipped to understand the nuances of AI-generated content, the potential for algorithmic bias, and the critical data privacy implications of the platforms our students use daily.

Strengthening the Home-School Alliance

This brings us to a crucial, often overlooked element of navigating this landscape: the home-school-student partnership triangle. Parents are understandably overwhelmed by how fast the digital world is shifting beneath their feet.

We are raising a generation of digitally comfortable learners who can navigate a tablet interface with astonishing ease before they can tie their shoelaces. However, it is a dangerous misconception to confuse this digital comfort with digital literacy. Our students may know how to operate the devices, but they lack the critical compass required to evaluate a flood of generated content or understand the mechanisms driving the platforms they use.

As modern educational leaders, we must actively partner with families to bridge this gap. We need to provide parents with the reassurance and the practical frameworks necessary to foster a balanced, healthy digital life at home. This means shifting the conversation away from anxious, restrictive battles over ‘screen time’ and moving towards meaningful discussions about purposeful ‘screen use’. Concepts such as ‘over-sharenting’, ‘digital footprints’, and ‘online identity sub-cultures’ should not be avoided as topics for discussion.

By opening clear, continuous lines of communication with families, we demystify the technology. We can share the language and frameworks we use in school to evaluate information, allowing parents to reinforce those same critical thinking skills at the dinner table.

From Consumers to Creators

While it is easy to focus on the challenges, this shift also presents an opportunity to rethink how schools consume technology. We are currently witnessing the democratisation of coding and custom software development.

In the past, schools were almost entirely reliant on purchasing expensive, bloated corporate software packages that offered a one-size-fits-all solution to nuanced educational challenges. Today, the barriers to creating bespoke digital tools have been dramatically lowered. Educators and school leaders now have the capability to build custom, accessible software solutions in-house.

By leveraging these democratised tools, we can shift our schools from being mere consumers of corporate technology to creators of our own tailored, safe learning environments. Building our own solutions allows us to maintain absolute control over our data, strip away unnecessary features that distract from learning, and design interfaces that perfectly match our specific pedagogical needs. It is an incredibly empowering shift that allows us to take back control from large vendors and ensure that the technology we use is perfectly aligned with our school’s ethos.

Leading with Confidence

Ultimately, navigating the AI shift is a test of our leadership and our grounding. The technology will continue to evolve rapidly, and the noise from the corporate sector will only grow louder. By anchoring ourselves in rigorous evaluation, embedding modern safeguarding practices, building strong partnerships with our families, and seizing the opportunity to create our own bespoke solutions, we can lead our communities confidently into the future. We can ensure that as the landscape changes, our focus remains exactly where it should be: on the holistic growth, wellbeing, and education of the children in our care.
