by Simon Hamblin
When ChatGPT was first released in 2022, many people were concerned that Artificial Intelligence would be used by students as just another plagiarism tool. Discussions in the education sector centred mainly on how to create AI-resistant assessments and how to ban its use in schools. As a Digital Learning and History teacher, I was intrigued not by the potential drawbacks, but by the immense possibilities AI could offer to improve the classroom experience.
Cyborgs or Centaurs – what does AI mean for the classroom?
The Boston Consulting Group has highlighted innovative uses of AI in the workplace, distinguishing between Centaur behaviour, where work is divided into discrete tasks that are handed either to the human or to the AI, and Cyborg behaviour, where AI is woven throughout the human workflow to enhance performance. This Cyborg approach made me think of how professional chess players started playing with computers rather than against them. The computers handle the heavy number-crunching, letting the players focus on strategy and play more creatively. This teamwork between humans and computers not only made them better players but also changed the game of chess itself.
I hope for a similar integration of AI in education, where it could handle routine administrative tasks, allowing teachers and students to focus more on the creative and critical components of learning. It’s about using AI to make learning better, not letting it do the work for you. I hoped my research would show that when students use AI as a tool rather than a shortcut, they engage more deeply and achieve better results.
Developing Cranbrook-specific AI
I started a research project with AIS NSW to build a Cranbrook-specific AI product that could be used in the classroom as a feedback tool. With limited coding experience, I was fortunate to get support from Professor Danny Liu at Sydney University. From a teaching perspective, the problem with using ChatGPT is that it operates like the Wild West. Its three major downfalls are that it can be biased, factually inaccurate and emotionally unaware. I thought that by building a custom chatbot we could counteract these downfalls and use AI technology in a safe environment, with enormous potential for teaching at Cranbrook.
With these downfalls in mind, I set out to define parameters within the technology and build a custom product that overcame the inconsistency and bias of the underlying data by giving the chatbot only the information it needs. Teachers can control how students interact with the AI: for each task they can set parameters and supply only the material students need, whether that is a textbook chapter, an academic paper, a novel or a compilation of resources, so that the threat of data inaccuracy is removed. Teachers can also suggest the framework for each task, for example whether they are looking for more creative responses or for disciplined use of evidence in academic writing. Importantly, the AI we created never gives the student the answer; instead, it asks a series of questions that prompt them to improve their work, and then attempts to teach them the content if required.
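To give a sense of what this looks like in practice, here is a minimal sketch of how a feedback chatbot of this kind could be configured. It assumes a Python script using the OpenAI chat API; the model name, prompt wording, function names and example task are all illustrative assumptions, not the actual Cranbrook implementation.

```python
# Minimal sketch: the teacher supplies the source material and a task
# framework, and a system prompt constrains the model to ask Socratic
# questions rather than write answers. All names here are illustrative.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def build_system_prompt(source_material: str, task_framework: str) -> str:
    """Combine the teacher's resources and task rules into one instruction."""
    return (
        "You are a feedback tutor for a school history class.\n"
        "Use ONLY the source material below; if something is not covered "
        "there, say so rather than guessing.\n"
        "Never write or rewrite the answer for the student. Instead, ask "
        "short questions that prompt them to improve their structure, use "
        "of evidence and clarity, and explain the underlying content only "
        "if they ask.\n\n"
        f"Task framework set by the teacher:\n{task_framework}\n\n"
        f"Source material:\n{source_material}"
    )


def get_feedback(student_draft: str, source_material: str, task_framework: str) -> str:
    """Return Socratic feedback on a student's first draft."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system", "content": build_system_prompt(source_material, task_framework)},
            {"role": "user", "content": f"Here is my first draft:\n{student_draft}"},
        ],
    )
    return response.choices[0].message.content


# Hypothetical example: a Year 8 task on Viking social structure
print(get_feedback(
    student_draft="Vikings had jarls, karls and thralls...",
    source_material="[extract from the class textbook chapter on Viking society]",
    task_framework="One paragraph, academic style, with clear use of evidence.",
))
```

The key design choice is that everything the teacher controls, such as the approved resources and the expectations for the task, sits in the system prompt, while the student only ever supplies their draft and follow-up questions.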
Testing how the chatbot can improve the classroom experience
As a case study for my research, students in my Year 8 History class were asked to write a paragraph in an academic style about the importance of Viking social structure, and then to submit it for marking. Without AI, I would generally spend a lot of time on simple things like paragraph structure and clear use of evidence. So I set up the chatbot as a first-draft step: it questioned each student about the structure and content of their writing and encouraged them to make improvements before they submitted their work to me.
It was great to see the level of student engagement, with all students attempting to improve their work beyond what they usually would. However, the improvements were only slight. It was not the great leveller that had been promised, with every student suddenly writing like a top student. My research suggested this came down to their limited AI literacy – the skills needed to harness AI to improve their work. The brilliant thing about a chatbot is that you can talk to it in natural language, like a personal tutor. Yet only three students in the class asked the AI a question about their work or asked for further instruction on how to act on its feedback. Those students who did probe the AI with questions saw the greatest improvement in the quality of their work, which was great to see.
Training is imperative
I was initially disappointed that I wasn’t witnessing a generation of Cyborgs being born. If anything, my research demonstrates that AI’s potential doesn’t always translate into immediate success. We cannot assume that students are skilled in using digital tools like AI just because they are digital natives who have grown up with devices. Students and teachers alike need training and support in using AI within the classroom so that they can harness its potential. The hope is that, with the right training, a tool like this will be used effectively by all teachers and students. With further testing of the app, and staff and student training planned for Term 3, I hope to see AI used effectively at Cranbrook later this year.
Simon Hamblin is Cranbrook School’s Digital Learning and History teacher.