How states are guiding schools to think about AI

- Jennifer A. Kingson, author of Axios What's Next

Just two states — California and Oregon — have issued policy guidance for schools on artificial intelligence platforms such as ChatGPT, a new report finds.
Why it matters: Teachers and administrators are eager for guidelines on how to use AI — and how to quash misuse. But the field is moving so rapidly that governments have been loath to issue pronouncements.
Driving the news: The Center on Reinventing Public Education (CRPE), a nonpartisan research center at Arizona State University, asked each of the 50 states and the District of Columbia to share their approach to AI guidance.
- Only California and Oregon issued official guidance for the current school year.
- 11 states are currently developing guidance: Arizona, Connecticut, Maine, Mississippi, Nebraska, New York, Ohio, Pennsylvania, Virginia, Vermont and Washington.
- 21 said they didn't plan to offer guidance for the foreseeable future, and 17 didn't respond.
Details: AI in education is such a sprawling topic that it's hard for educators and regulators to know what to tackle first.
- It encompasses everything from teaching the subject in classrooms to using AI for school operations (scheduling, ordering supplies) to cracking down on plagiarism and kids using it to do their homework.
- Current guidance focuses "on the ethical and equity implications of AI's use, recommendations for teachers' and students' appropriate use of AI, and emerging best practices to enhance instruction," per CRPE.
The intrigue: There's a split between states that plan to issue AI guidance and those that want to let individual districts decide their own policies.
- In the latter camp are Montana, Iowa, North Dakota and Wyoming.
- By contrast, "North Carolina and Vermont acknowledge districts' decision-making power, but are still developing guidance to inform their actions," CRPE said.
What they're saying: "We found that there was very little information that states were putting out," says Bree Dusseault, principal and managing director at CRPE.
- "We've defined it pretty broadly," Dusseault told Axios. "Really, what we're looking for is any official statements or explanation of what AI is; how it might apply to the classroom; how schools, teachers or system leaders might be thinking of building effective or ethical use practices."
California's and Oregon's guidance was published after CRPE first polled states on this topic in August.
- Back then, superintendents told CRPE that "they're a little overwhelmed by AI," Dusseault said.
- "They are really heads-down dealing with post-pandemic recovery, learning loss and mental health, and the introduction of AI has felt like another big question that they would like help with," she said.
Yes, but: There are guidelines out there for those who seek them out.
- Code.org, an education nonprofit, has put out a fairly comprehensive toolkit with sample guidance on the use of AI.
State of play: While schools recognize the importance of teaching AI and allowing students to use it as a research tool, they're perplexed about how to keep the genie properly in the bottle.
- Examples of misuse abound. Boys at a New Jersey high school are accused of circulating AI-generated nude images of female classmates, for instance.
- Bias is also a concern, given that generative AI programs are trained on models that can filter information in wrong or inappropriate ways.
- Teachers and administrators — who are short-staffed to begin with — are all at various stages of the AI learning curve.
Where it stands: President Biden's recent executive order on AI addresses education.
- Biden called for the creation of "resources to support educators deploying AI-enabled educational tools, such as personalized tutoring in schools."
- At the same time, private-sector groups like InnovateEDU and its EdSafe AI Alliance are working to advance the responsible use of AI in education.
"Leaders and teachers and even students and family members are looking for practical guidance," said Amanda Bickerstaff, founder of a consulting company called AI for Education and a member of the EdSafe Alliance.
- Bickerstaff, who gives workshops on generative AI for everyone from superintendents to high school students, says that people are thirsty for the latest information.
- "We just want to bring down the uncertainty. We want to bring down the stress level," she tells Axios.
Between the lines: AI technology is advancing so quickly that any guidelines for schools must be deliberately broad and flexible, lest they soon grow obsolete.
- "We are hearing from our superintendents that it is a huge, ongoing and constantly evolving conversation," said Noelle Ellerson Ng, associate executive director, advocacy and government, for AASA, the School Superintendents Association.
- A lot of the experimentation is playing out in classrooms, she tells Axios. Teachers want to know, "How can you integrate AI into an immersive experience in every subject? How are we teaching them to use AI and have good ethics as academics?"
- Teachers must emphasize that AI is "not an inherently bad tool," but one that can be used to teach critical thinking, said Ellerson Ng. Students must learn to evaluate whether a source is valid, she said.
The bottom line: AI is developing faster than school districts can establish training programs and guidelines — and while state guidelines may prove helpful, they're not likely to be the final word.