Intelligence service: how schools are managing AI

Artificial intelligence is everywhere. Since ChatGPT launched in November 2022, it seems a day hasn’t gone by without a shouty headline about robots taking over our jobs and schools. While much of the press has been predictably negative, with lurid stories about cheating and plagiarism, it has pinpointed the potential of artificial intelligence (AI) to radically change the education landscape for pupils and teachers. So, what effect is AI having on education?

“We are just starting to talk about AI in schools, as teachers and students,” says Will Van Reyk, head of history and politics at North London Collegiate School (NLCS). Newspapers, he says, have reduced the story to, “give students an essay question and AI will give you an answer”. But that “narrative misses the points about what AI brings in terms of benefits”.

Mr Van Reyk is acutely aware of the challenges surrounding the introduction of AI into schools. Not least the demanding legal, ethical, compliance and safeguarding frameworks that schools must adopt to use it. But he is adamant that “if students and staff are taught in the right way then it could be very beneficial for both parties.

“AI offers a massive new dimension to students in the classroom,” he says, with benefits to students’ critical thinking, problem solving and scholarly research.

He cites a new AI tool, Perplexity – dubbed the ‘Google killer’ – which Amazon’s Jeff Bezos has recently invested millions in. “It’s an amazing knowledge research tool and will enhance any student’s ability to access high quality articles,” he says. “I think it’s the best AI tool in education.”

‘You can really bring history to life’

Mr Van Reyk already uses ChatGPT in his lessons, often for role play scenarios. Students can ‘recreate’ a given historical event – the Cuban Missile Crisis, for example – where they can assume the role of US president John F Kennedy and ask ChatGPT questions which the AI then answers. “You can really bring history to life,” he says.

And these interactions can throw up wider issues around AI that can then be discussed in the classroom.

For example, Mr Van Reyk explains how his class was studying Britain in the 1950s. The students asked ChatGPT to role-play an average person from the era and it immediately said: “I go to the pub in the evening.” “The default response was male,” says Mr Van Reyk. This led to an interesting class discussion about the inherent biases in AI. “It is a reflection of a biased society, and it’s about teaching students to recognise that,” says Mr Van Reyk, adding: “Women are being cut out of the story in the development of AI – as a girls’ school, this is really important to us.”

Mr Van Reyk says he is hopeful that the narrative will move on from the potential for cheating and “assessment itself will shift to assuming that all students are using AI and those who don’t use it well will not be as good as a student who uses it in the appropriate way”.


‘Everyone is playing catch up’

Steve Birtles, head of digital teaching and learning at Eton College, is more circumspect about the advent of AI assessment. “Fundamentally, the way qualifications are assessed [in schools] with sit-down written exams – those are the things that need to shift before we can make huge changes to the way we approach teaching with AI,” he says.

Eton is using ChatGPT to enhance learning in the classroom. Its launch less than 18 months ago has meant “everyone is playing catch up”, says Jonnie Noakes, the director of teaching and learning at Eton. The reason this AI tool brought about such a sea-change in schools was twofold, he explains. Firstly, ChatGPT is an example of ‘conversational’ AI, which “has made AI accessible to anyone who can hold a conversation, so it’s available to our school-age pupils in a way it wasn’t before”. And secondly, ChatGPT has a 13+ age rating, unlike, say, Claude AI or Google Bard, which both have an 18+ rating.

Eton, just like every other school, had to decide quickly what its stance was on using AI in the classroom. “We decided that if we forbade it that would effectively close down the discussion about its proper use, and we think it does have legitimate and educationally sound uses if used in the right way,” says Mr Noakes. “We are trying to teach our pupils how to use it properly. Help them to understand the ethical concerns, help them to understand its limitations and really emphasise academic integrity and honesty.”

Mr Birtles says students have been pleasantly surprised when invited to use AI, a reaction he sums up as: ‘Oh, we are allowed to!’ Teachers, on the other hand, have been “more cautious”.

“We have been focusing a lot of attention recently on helping colleagues get up to speed,” says Mr Noakes. “For a lot of colleagues who are not technically minded – and I put myself in that camp – this can seem like an enormous challenge. But it’s a challenge people understand they need to be up for, because trying to ignore AI really isn’t an option.”

Eton has taken a ‘platform agnostic’ approach to teaching pupils how to use AI. In other words, developing pupils’ skills across a number of AI platforms. The key, says Mr Birtles, is “prompt engineering”: an AI tool is only as good as the question you ask it. Mr Birtles has created a cookbook of ‘prompt recipes’ for students and teachers – scenarios to help them start navigating these new tools.

‘AI is just another tool’

Jo Sharrock, Head at Putney High School, GDST, believes it’s important that schools don’t jump on the AI bandwagon without working out what it is they want to achieve first. “AI has been around for a long time – the gamechanger is generative AI. There is a lot of hype around ChatGPT but for us, AI is just another tool, part of our modern scholarship approach,” she says.


She believes the use of AI should be something “we choose to use”, not a default or a given. And it shouldn’t come at the expense of developing pupils’ critical thinking abilities.

“I do think it’s really important that we stand up and say… just because you can do it doesn’t mean you should do it.”

She wants her students to understand their own “brilliant minds” first and remains wary of anything that de-skills or inhibits the natural learning process. “It is our responsibility to develop students’ minds in the way that we need to,” she says.

She cites Putney High School’s (PHS) Science of Learning curriculum which runs across all subjects from Year 7 to Year 13. It has two aims, says Ms Sharrock: “To foster awe and wonder in the brilliance of the human mind, and to create curious, self-sufficient learners who are comfortable with struggle rather than scared of challenge.”

It’s all about “How can I get my beautiful brain to do beautiful things?” says Ms Sharrock. Based on current research, it’s taught through practical techniques, whether that’s orienteering with a map and compass in the lower years or studying how to unlock the power of the brain higher up the school.

Now into its sixth year, “we can see the success in our raw results”, says Ms Sharrock.

Summer 2023 saw the school’s best tally of 9-8 grades at GCSE and the most A* grades achieved at A-level since 2011.

“But more exciting is that we now have girls who have gone through the Science of Learning curriculum who are absolutely thriving. They tell us: ‘It’s because we know how our brain works… you’ve given us the skills that take us way beyond the exam hall.’ Which is all we want to do.”

‘We have missed a step along the way’

Brighton Girls, like its sister GDST school Putney High, is an enthusiastic adopter of all things STEM with students coding from Year 2, design thinking plugged into the curriculum and AI being used in the classroom. “We want our students to feel from a young, young age that this is their world, their space,” says its Head, Rosie McColl.

But Ms McColl has a different, personal take on the advent of AI in the classroom and it relates to students with special educational needs (SEN). “There’s so much talk at the moment about generative AI but I think we have missed a step along the way,” she says.

Her own 10-year-old child is neuroatypical and struggles with dyslexia and dyscalculia. The biggest barrier to his learning is putting pen to paper. He is fine when he can use a simple voice-recognition AI tool, such as speech-to-text, which allows him to write fluent essays.

“We use this all the time,” says Ms McColl. “This is non-generative AI. We’ve leapt onto generative stuff but the non-generative AI is really powerful and could be really transformative at removing barriers to learning.” Even generative AI tools – Ms McColl cites Mindstone, an open-source AI platform that can simplify huge passages of text at a stroke – can be really helpful for students with SEN.


However, as long as our “rigid exam system” insists that children are tested in final exams with pen and paper, Ms McColl says there’s an enormous fear that governs how children, particularly neuro-divergent students, learn.

“The exam system is driving everything,” she says. She would like to see less talk around generative AI and more focus on taking down the barriers for SEN pupils and harnessing the tools that have immediate benefits for them.

‘Making AI work for teachers’

But Tom Rogerson, headmaster at Cottesmore School, wants to go further, faster with generative AI as he believes in its potential as a “wellbeing augmentor for teachers”. Mr Rogerson received worldwide attention in autumn 2023 with the launch of Abigail Bailey, an AI chatbot with a female form whom Mr Rogerson ‘appointed’ as a ‘principal headteacher’ to work alongside him at the West Sussex prep school. But behind the clever PR story lies a more serious intent.

Mr Rogerson says that from the first moment he discovered ChatGPT he saw the potential for a tool that could help the serious retention and recruitment crisis in schools by reducing “the paperwork pandemic” that burdens teachers.

“What I’m doing is trying to make AI work for teachers. That’s the bottom line. I want them to have to stare into a screen less so they can be with pupils more. I want my teachers to be well.”

And Mr Rogerson is putting his money, time and energy where his mouth is, holding the first conference about AI in schools at Cottesmore in May 2023 and a series of AI masterclasses – “a mega event” – for teachers across state and independent sectors in September last year.

Mr Rogerson was insistent that these events were cross-sector. His twin obsessions are wellbeing for teachers and AI inequity. It’s one thing for a well-resourced independent school to invest the time and money implementing AI, but what about a state school? For example, ChatGPT 3.5 is free but ChatGPT Plus, with access to GPT-4 and voice activation, isn’t – it sits behind a paywall. “We are not creating a superpower of AI at Cottesmore,” says Mr Rogerson. “We want everyone to become an AI superpower. We want to be inclusive.”

And as for Abigail Bailey, she is now ABI, Cottesmore’s ‘strategic leadership AI bot’. It turns out anthropomorphism is a no-no in AI so ABI no longer has a human form, but Mr Rogerson uses it regularly. “Educational leadership can be a lonely place,” he says, and ABI is a hugely useful tool, a sounding board and sense-checker at 1am “when your deputy head is fast asleep in bed as they should be”.

This article first appeared in The Week’s Independent Schools Guide Spring/Summer 2024, edited by Amanda Constance.

