AI edtech has exploded in popularity over the last two years. And while there are many promising tools, there are also many that are overhyped, poorly designed, or simply unnecessary, however well-intentioned. Los Angeles Unified School District learned this the hard way recently: only a few months after launch, the district had to shelve a chatbot system it had spent $3 million on because the company that developed it had laid off half its staff.
To help educators avoid AI snake oil and properly vet AI tools, we turn to Tal Havivi, managing director of research and development, and Joseph South, chief innovation officer, both at ISTE/ASCD.
Here are their tips for properly vetting AI tools.
1. Beware Of AI Products That Overpromise
If you see an AI tool advertised as doing teachers' work for them, you should probably approach it with the same skepticism you would a product claiming to turn water into wine.
“Anyone who implies that their product will make teachers obsolete is overhyping,” South says. “AI is a powerful companion to an educator, but not a replacement for their job or their judgment.”
South adds that AI edtech tools should meet the same privacy standards as any other tool. “If a provider can’t give you ready answers to questions like ‘How do you protect the privacy of the users?’ ‘Is this product FERPA compliant?’ ‘How do you handle PII and where is it stored?’ then they are probably not ready to sell to educators.”
In addition, AI needs to meet the same standards around accessibility as any other tool a school might utilize, South says.
2. Focus On Existing Learning Goals
“It is so important that we don’t get distracted by the glitz of AI,” South says. “AI can do some really amazing things, but fundamentally, AI is limited to making predictions. In the end, it’s the humans that need to make judgments about what to do with those predictions.”
Because of this, South says, the best place to start is not with the technology but with your learning goals. He advises asking yourself the following questions: “What do you want the students to learn?” “How do you want them to learn it?” and “What evidence would you need to see to determine that they did indeed learn it?”
He adds, “Once you have a very clear idea of those three things, then you can go to the AI marketplace and see which solutions best match your learning goals and pedagogical approach.”
3. Watch Out For Hallucinations
Inaccurate or inappropriate AI hallucinations have been well-publicized, and they can be particularly damaging in an educational setting. Havivi advises looking for AI tools that give educators some control over what the AI can and can't do.
“Since GenAI does not ‘think,’ it’s important to see if products have controls in place, and controls school systems and educators can use, to further reduce unexpected/unwanted output,” Havivi says. “Educators should ask vendors what feedback loops are in place to help product developers improve controls based on educator experience and use of the product.”
South adds, “Virtually all providers focused on the K-12 market are taking steps to make their product safe and effective to use, but the depth and consistency of those efforts varies widely.”
He suggests reviewing the tool’s protection plans and then conducting a “red team” exercise. “You and your team log in as a typical user would but with the sole intent to try to break through those protections as best you can to verify that they are working as designed,” he says. “AI should never be the last word on any fact. It is the user’s job to verify anything the AI suggests. Remember, generative AI is simply predicting the next word. It does a mind-bogglingly good job of this, but it actually doesn’t know what it is saying.”
4. Avoid Long-Term Contracts
When it comes to signing on with an AI edtech tool, being a little commitment-shy is probably a good thing.
“The AI tool provider market shakeout hasn’t happened yet,” South says. “There are far more tool providers out there than the market can sustain. Signing a multiyear contract is a bet you are making that the provider will be here in 24 months, which may or may not be the case.”
In addition, you want to make sure your contract includes free upgrades as the technology improves. “AI is evolving so rapidly that today’s functionality is likely to feel antiquated in just a few months,” South says. “You don’t want to be stuck contractually with the current version when everyone else has moved on to something twice as fast and twice as good.”
5. Make Sure Your Approach To AI Is Evolving
Of course, as with many aspects of education, the solutions for properly integrating AI aren’t one-size-fits-all, nor are they static.
“For vetting products, it’s particularly important to understand the evolving capabilities and limitations of generative AI,” Havivi says. “One challenge is that the capabilities and limitations of generative AI are not fixed.”
Early large language models, such as GPT-3 and even GPT-4, were particularly good at tasks requiring probabilistic or heuristic reasoning, in which there were “lots of paths to a ‘good enough’ answer,” Havivi says. These models were competent at tasks such as summarizing content or generating creative text, but not great at precise and predictable answers to things such as complex math problems.
“That’s changing — models are improving,” Havivi says. “This rapid evolution means that evaluating AI tools will be dynamic, and it is all the more important to ensure that the fundamental parts of evaluating edtech remain consistent.”