A new significant date?
I’ll start by giving you three dates, and ask you to tell me what historically significant things happened on these days: 25 December 1066; 11 November 1918; 30 November 2022
What do these three dates have in common, other than falling in the last two months of the year?
The first two are pretty well-known dates in Western historical circles; one is the date of the coronation of William of Normandy in Westminster Abbey in 1066, and the other, the day on which the armistice to end World War One was signed at Compiègne in France.
The third and – currently – least well-known date has the potential to be as significant as the other two.
This was the day when OpenAI released their chatbot ChatGPT on the unsuspecting world.
Why might this date bear as much significance as a coronation and signing of paperwork in a field? On these days the shape of the western world was remoulded, beginning a ripple effect which influenced the economies, cultures and the daily experience of citizens across the globe.
The impact on educators from the first two dates has been codified into curriculums, schemes of work and exam questions.
The impact of the third date on students and teachers is transformational. The challenges ahead are somewhat unknown – potentially heralding a whole new way of teaching and learning on the horizon for many school-based subjects.
AI-botted GCSE questions
Horror stories of AI bots producing convincing answers to GCSE questions have painted AI as a transformative wave that must surely bring educational assessment reform to everything in its wake.
But what might the response look like? Education bodies and institutions have three broad options in how to deal with AI. I write from the campus of a Russell Group university, at which we are having many fruitful conversations around academic integrity and authenticity of work.
The three options are to:
a) cut off the AI pipe at source and ban its use at all stages;
b) attempt to control and limit it, akin to working in a controlled sandbox; or
c) let it live in the wild and accept that, for all its strengths and weaknesses, it is a Pandora’s box – we can neither close the lid, nor fully control what issues from inside.
What are the current challenges for History teachers in the state sector?
There are some positive challenges in using AI to bring the curriculum to life. If you want a Q&A with a deceased historical figure, that’s now an option.
Once ethical issues such as those raised by the AI-generated ‘interview’ with Michael Schumacher are dealt with, it is entirely possible to question someone from the trenches in World War One, discuss the Spanish Armada with Drake, or quiz a barber/surgeon delivering rudimentary medical care out in the American Wild West.
If you’re interested in reading a poem on the Brexit vote in the style of Chaucer, or a BBC news article on the Great Fire of London, these can be created in a matter of seconds.
A further area to enhance the delivery of the curriculum can be found through using AI to create digital artefacts.
This can be a powerful and positive application of technology in the classroom.
For example, it’s possible to generate high-quality propaganda posters from World Wars One and Two – asking students to identify the key insignia and discussion points needed as prompt questions can shape the higher-order thinking skills required in exams.
Similarly, being able to generate a relatively realistic picture of a Viking longboat replete with Vikings, or a photo of William Harvey discovering the circulatory system provides a near endless set of bespoke images that students will not have used before, against which they can apply a critical mindset.
AI can be used to mark text and suggest improvements. By giving many of these AI bots (ChatGPT and Google Bard are the current brand leaders) a rubric and a mock answer, and asking for comments and improvements, a virtuous cycle of on-demand marking tools may provide stretch and challenge to all pupils (although there are substantial limitations, discussed later).
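For the technically curious, the rubric-plus-mock-answer marking loop described above can be sketched in a few lines of Python. This is a hypothetical illustration only: the rubric text, the sample answer, the model name and the helper function are all assumptions, and running the final call requires the OpenAI Python SDK and an API key.

```python
import os

# Hypothetical rubric and mock answer, standing in for a teacher's own material.
RUBRIC = ("Level 3 (5-6 marks): explains two consequences of the event, "
          "each supported with specific historical detail.")
MOCK_ANSWER = ("One consequence of the Norman Conquest was the building of "
               "castles, which helped William control the population.")

def build_marking_prompt(rubric: str, answer: str) -> list:
    """Assemble chat messages: the rubric as marking context, then the answer."""
    return [
        {"role": "system",
         "content": ("You are a History teacher. Mark the student's answer "
                     "against the rubric, then suggest two concrete improvements.")},
        {"role": "user",
         "content": f"Rubric:\n{rubric}\n\nStudent answer:\n{answer}"},
    ]

# Only attempt the API call when a key is actually configured.
if os.getenv("OPENAI_API_KEY"):
    from openai import OpenAI
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; any chat-capable model works
        messages=build_marking_prompt(RUBRIC, MOCK_ANSWER),
    )
    print(response.choices[0].message.content)
```

The point of the separation is that the same prompt-building step works with any of the bots mentioned above; only the final call changes between providers.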
This brief glimpse into the future aligns with the current Secretary of State for Education’s thoughts around making AI responsible for the heavy lifting of planning and marking – but that is a conversation for another day.
What is the immediate impact on educators?
Reflecting over the first ten months of 2023, suspicions and fears around the immediate adoption of AI as an always-on exam mill and virtual sweatshop have not yet come to fruition.
There is currently no substantiation of students using AI wholesale – although anecdotal evidence hints at schools’ and universities’ concerns around the authenticity of submitted coursework, which they are as yet unable to attribute to AI.
There is not yet enough evidence that AI plagiarism detectors are sufficiently nuanced to accurately detect whether work submitted via an NEA is human or computer generated.
Might this presage the demise of the NEA in favour of more rigorously authenticated alternatives (online or controlled assessment) coming to the fore?
A further consideration might be to place greater value on oral exams. If workable, this approach could play into an Historian’s skillset, as one who is comfortable with defending their position and producing balanced arguments.
Can AI influence be seen as just another issue in teaching, alongside students using calculators in Maths exams, or lazy research methods such as defaulting to Wikipedia or the first Google return as a single source to read and cite?
The danger many in education are facing is that AI responses generated to essay style questions can look convincing, and emulate an authentic student voice.
When we unpick the AI answers more carefully, it is clear that they can be churned out en masse (similar to an essay mill), with hollow responses to reflective questions produced on demand.
Such techniques do not exercise the skillset that History teachers chose their careers to hone, and so a focus on developing students’ research and essay-writing skills becomes ever more important.
The JCQ response
In March 2023, the Joint Council for Qualifications (JCQ) provided guidance around AI in assessment, having reminded teachers of the need to upskill students and staff to become critical users of this technology.
Using AI appropriately, with an awareness of its impact, is a skill that will be as important as computational thinking and digital literacy – and staff and students alike need equipping with it.
For current and future generations to use AI effectively – critically discerning its output and limitations, rather than adopting a default fire-and-forget approach – will take training and reinforcement from primary school upwards.
While some might argue that the responses from JCQ and the government have not been definitive, it seems clear that the AI genie is out of the bottle.
Like it or loathe it, the education sector must develop solid policies. Such policies may be agnostic to specific AI bots, but must provide teachers and assessors with a workable template for today and tomorrow.
The issues around AI and terminal assessment are hugely challenging, and their pertinence to History departments can only be briefly discussed here.
The fear of plagiarism has been well considered, and commented upon in the mainstream media.
Another frequently cited anxiety is that knowledge will be outsourced to an AI model in the real world, and that the skills underpinning History could thus become redundant.
I don’t believe that this will be the case, as History teachers have the skills to adapt to future challenges.
The skills that AI can’t demonstrate (critical thinking and essay writing, for example) will simply become more pronounced, along with practical reforms to the NEA and coursework.
Regarding the reform of NEA and coursework, in May 2023, Ofqual’s chief regulator suggested that a potential solution to the use of AI could be for students to complete all coursework in exam conditions, thus removing the opportunity to use AI in essays.
Whilst many teachers may shudder at the thought of this, some exam boards do offer online assessment replacing coursework with controlled assessment, albeit not (yet) for History.
This approach replaces the coursework element with an environment that can be managed safely. Does this point towards a different way of teaching and assessing?
Dangers and opportunities?
The danger of AI engines filtering results could be seen as a limitation on free speech in the History classroom.
Some popular image-creation websites implement heavy filtering of historical subject matter – engagement with which is integral to the skillset of History. Certain leaders, such as Stalin and the current Chinese leader Xi Jinping, are banned from being generated in selected tasks, whereas the recent AI-generated images of Donald Trump being arrested, or the Pope wearing fashionable (and impractical) clothes, highlight more of the challenges that this virtual wild west is giving us.
Rules are still being worked out – and, given this lack of framework, some have simply disregarded those that exist.
If we think outside the box, AI could hold some rewards ahead for History teachers.
Reforming assessment to incorporate personalised learning and changing the assessment methodology provides the opportunity to allow adjustments to increase accessibility and inclusivity for all History students.
Ultimately, for all historians, the engagement with AI is not going to disappear.
We need to understand what underpins good teaching and learning, as well as the higher-order skills of analysis and evaluation, and the understanding and remembering of concepts, which are targeted in written prose responses at GCSE and A-level.
The landscape ahead may no longer be populated by NEAs as we currently know them, but by new approaches that allow for trustworthy assessment and equip students with skills in discernment – enabling them to accurately distinguish fact from fiction.
Responses must be centrally co-ordinated – wider than a single department or a lone school champion. Until the dust starts to settle, we remain in limbo without definitive answers to these challenges.
However, a final, conciliatory thought is that the universal accessibility of AI bots has empowered the ill-equipped student.
Granted, there is a skill in creating prompts (that’s a whole different essay), but prompt-writing ability notwithstanding, AI theoretically democratises education.
As per the mission of Sora Schools, students can be set free in the process of teaching and learning, and be provided with the opportunities and freedom to learn and discover things by themselves.
Our role as educators is to engage in this uncharted territory, helping our students as they navigate AI.
While the landscape is new, the skill to map it is a function of your expert knowledge.
The wider question of what and how we assess in the school system in a world of AI is the big one – and while AI has drawn it sharply into focus, it may yet be resolved by that same well-established expertise.
Jonty Leese is an Associate Professor within the Centre for Teacher Education (CTE) at the University of Warwick. He helps plan, co-ordinate and deliver the secondary PGCE with a focus on Computer Science. He works with Project Enthuse and the NCCE and he sees himself as a futurologist.