
Are we doing enough to ensure our school communities have sufficient AI training and resources? 

By Katherine Howard, Head of Education & Wellbeing at Smoothwall 

Many of us now find ourselves well into the rhythms of the new school year. Yet, amid all that consistency and familiarity, it feels as though the battle to protect children from online harm is becoming tougher and more unpredictable from one year to the next.

Keeping up with digital advances, particularly AI and its more harmful applications, requires time and, most importantly, resources; both of which we all know are in short supply. So, while the dedication and commitment of school leaders and Designated Safeguarding Leads (DSLs) continue to be the fundamental backbone of protecting students online, it’s critical to ask ourselves: are school communities truly being supported and adequately equipped to harness the benefits of AI while protecting their pupils and themselves from its dangers?

It is of course important to say that AI has the potential to be – and in many applications of learning already is – a powerful and effective tool for good. Within the last few years alone, AI has been deployed to enhance academic development, child safety, inclusivity and organisational efficiency. But we are fast approaching a scenario where, if left unchecked and under-supported, the good will soon be outweighed by the bad.

As the volume of AI applications grows, school communities are understandably feeling increasingly overwhelmed and underprepared. Without proper training and support, it will become more difficult to shield students from growing risks, such as exposure to inappropriate content and the sharing of explicit materials. 

Qoria recently surveyed school communities in the UK, US, Australia and New Zealand to determine the level of understanding around the impact of AI, specifically as an enabler of harmful content and child sexual abuse material (CSAM). The survey found that almost two-thirds (64%) of respondents reported that limited staff training, knowledge and time were the primary challenges they faced when addressing the dangers posed by AI, CSAM and the sharing of explicit content among students. This is a clear call from the school community for additional resources so that safeguarding roles can be delivered more effectively.

One UK college leader surveyed said: “AI is a rapidly growing area that, as a college, we are trying (and probably failing) to keep up with. The goalposts move so fast that we are constantly playing catch-up.”

Navigating these challenges is no small task for any educational organisation, however big, well funded or professionally staffed. For those on the frontline, it is essential that the whole school community – from leadership teams to students – is fully aware of the potential issues, understands them and is ‘tooled up’ to deal with them.

The government has a role to play here too. School communities should not be left to navigate these challenges alone; they need clear guidance and resources from policymakers, in the form of regulation, to ensure that schools can access the training and support they need. The UK government does currently invest in AI in education, yet its focus is on the aforementioned benefits AI can bring to the quality of education. It also needs to acknowledge the challenges AI poses to students’ online safety, work with school communities to put regulation and policies in place, and provide educators with the support and guidance they truly need.

Supporting the school community now

While some school communities may feel helpless, there are many proactive and proven approaches to address this growing threat. Whilst we wait for the government and wider communities to switch on to the issue, there are things schools can do: 

Develop an AI working party – we’ve heard from many schools that they are setting up an AI working party. This is a great way to align all of the stakeholders around a common understanding and put strategies in place to support the whole community. Regular meetings and discussions will also improve communication, empower staff, enable shared responsibilities, and promote professional development in AI, risk management, and response.

Review and update school policies – ensure that your incident management procedures cover AI-based incidents, with particular reference to victim support. Schools can also review their student curriculums to include education on the risks of AI, as well as explore its positive applications.

Increase staff training – a better understanding of these issues would allow everyone involved to be better equipped to detect, intervene, and educate about the risks and consequences associated with AI. 

There are steps you can take to invest in ongoing professional development and tools for staff, such as scheduling regular, targeted training on the latest digital trends; sessions on the psychological and social dynamics that underpin poor online behaviours by students; or policy sessions focused on intervention and support strategies using the tools they already have.

Extend education to parents – measures put in place to educate, support and protect staff and students need to extend beyond the school community. By involving parents and guardians, you can make sure that this awareness helps provide support and protection in the home environment too.

Increase digital risk visibility with technology – those involved in digital safety within the school community know that they don’t have enough eyes and ears to monitor the situation effectively. Digital monitoring solutions, often supported by human moderation, can provide real-time insights into potential risks such as exposure to harmful content, inappropriate conversations and online grooming. These solutions can help you identify issues early, before they escalate. Crucially, this technology keeps the school community informed, enabling swift and accurate intervention or response.

There are content filtering solutions available that intervene earlier, preventing students from accessing online content that could be harmful to either their physical, mental, or academic wellbeing. 

School communities provide hope

There is no doubt that AI has brought benefits to education; however, it has also introduced significant new challenges for the school community when it comes to keeping students safe online. But there is reason for hope.

Schools have always played a key role in supporting their communities in times of difficulty. Today, they are recognising how technology and educational strategies can have a positive impact. By leaning into these tools and taking a collaborative, strengths-based approach, schools can empower students and their communities with the knowledge and skills they need to thrive in online environments – focusing not just on risk, but on their resilience and potential as young learners and responsible digital citizens.

With the right support and guidance, children will be able to harness the numerous benefits of AI while its risks are mitigated. By shaping their digital experiences and fostering a culture of digital citizenship, schools can build a safer and more positive online environment for future generations.

Together, we can embrace the opportunities presented by AI and pave the way for a brighter, more secure future for our children.
