Implementing artificial intelligence in schools can be a difficult conversation to navigate. As AI grows and evolves, how we use it changes alongside it. How can schools keep up with such a malleable landscape while addressing the issues that come with its use?
Kerry Gallagher, Assistant Principal for Teaching and Learning at St. John’s Prep in Danvers, Massachusetts, expounds on a method she created to help acknowledge and implement AI use in her district, as well as to make an ongoing commitment to staying involved and informed about its use.
Gallagher, recipient of a Tech & Learning Innovative Leader Award for Most Innovative Assistant Principal presented during the recent Northeast Regional Summit, shares how she came to understand and implement AI at her school in an inclusive manner.
Staying Open-Minded About AI
Overcoming the early media coverage of AI in schools was a significant challenge, Gallagher says.
“I think most of the media attention [for AI] at the time was pretty negative and meant to scare people,” she says. “So we had some colleagues who were scared. We also had some colleagues who were curious, and we had some people who had a mix of all of those emotions. We started by listening first to our colleagues and what their concerns were. We then started to take notice, with a very trained eye, of what types of work students might be submitting, and trying to identify what ways we can discern whether a student has used AI in the course of creating his work.”
Understanding both educators’ concerns and the ways students used AI gave Gallagher a better sense of how to approach AI in schools.
“In hindsight, I’m proud of the way we didn’t rush to a ban or a policy or creating guidelines before we approached the situation with curiosity,” she says. “We looked at our academic integrity policies and our data privacy policies, then we did some work about how those existing policies apply to AI.”
Regardless of their role in the process, everyone was kept aware of what was happening with AI within the school.
“We shared what we learned with students, parents, and teachers all at the same time, on the same day,” Gallagher says. “Everyone who is a part of our school community, no matter what their role, was on the same page.”
Addressing AI Through Existing Policy
As it turned out, no new policy had to be created for AI; existing policy already covered its use among students, as Gallagher explains.
“We also talked about how our academic integrity policy requires students to cite sources if they use a source besides themselves,” she says. “That would include generative AI and it requires them to not use computer-generated materials, unless they have the permission of their teacher. That particular wording was already in our integrity policy, so we didn’t have to create anything new. We just helped people understand how to apply generative AI.”
Throughout the AI integration cycle, Gallagher applied the academic integrity policy whenever necessary. She also kept families in the loop about where the school was in the process.
“And that’s how we closed the school year, applying our policies as we promised we would and allowing people who were really interested in exploring [AI] to explore on a small scale,” she says.
Building On the Existing Model
After creating an initial AI integration cycle over the last school year, Gallagher expanded on it, homing in on specific AI platforms.
“This school year, we took the next step forward and we decided to focus on finding the generative AI tools that would be the tools for our teachers at our school that we could sign off on,” she says. “[The tools] would be something that would be of use to our teachers, and that we could do more dedicated professional learning on it so our teachers felt more prepared. We did select a tool, we rolled it out, we trained all teachers on it across the board.”
Training all teachers on the use of AI helped make it a more accepted concept, even if not everyone uses it in their classroom.
“As a result of [the widespread training], in addition to ongoing listening sessions and professional learning, I’m confident in saying we have normalized the use of generative AI by our teachers in the course of their work,” Gallagher says. “I feel confident that our teachers are using it in a way that is aligned with our mission and our sense of ethics as a school. And the reason I have that confidence is that the platform we chose allows us to see what our teachers are doing. We’ve given teachers the opportunity to share with us what they’re doing as part of our routine and structure.”
With a more stable structure in place, Gallagher says that AI elicits more positive feelings this time around.
“I think the excitement is growing. The curiosity is growing,” she says. “The feelings of fear and anxiety are not gone, but they are easing. So the next step is doing the same cycle of work around what a student might use AI for. But we need to do the same culture work that we did with our teachers. We need to start by listening and not assuming that we know what people’s concerns are. We need to start by training everyone, communicating effectively, and applying our values to what we are seeing.”
This approach to AI helps mitigate inherent biases and fears about its use while fostering a learning environment for teachers, students, and parents. By turning apprehension into curiosity without compromising school policy, Gallagher has safely put AI on the radar and ushered in its use through professional development and open dialogue.