AI Governance
Creating an AI acceptable use policy for your school
A 2025 CoSN survey found that 43% of K-12 districts still lack formal AI policies. The percentage is probably higher for private schools, where there’s often less central coordination.
This isn’t because administrators don’t care. It’s because creating good policy is genuinely hard. Technology moves fast. Stakeholder opinions vary wildly. Nobody wants to get it wrong.
But waiting has costs too. Every week without clear guidance creates more confusion for teachers. More inconsistency in how AI use is handled. More opportunities for conflict.
Why your current approach probably isn’t working
If you don’t have a formal policy, your school is probably doing one of two things.
Some schools are trying to ban AI entirely. This doesn’t work. Students use ChatGPT at home on their phones. They’ll find workarounds. All you accomplish is pushing AI use underground where you can’t see or influence it.
Other schools are handling things “case by case.” This creates inconsistency. One teacher treats AI-assisted work as cheating. Another encourages it. Students get different messages depending on which class they’re in. Parents don’t know what to expect.
Neither approach serves students, teachers, or families well.
What a good AI policy actually covers
Based on our experience helping schools develop AI governance, here’s what should be in your policy:
Definitions
What counts as AI? This seems obvious until you realize students are using AI features in Grammarly, Google Docs, Canva, and dozens of other tools. You need to be specific about what you’re regulating.
Student use guidelines
When can students use AI? The answer shouldn’t be “never” or “always” - it should depend on the assignment and learning objectives. Be explicit about categories:
- AI prohibited (original creative work, in-class assessments)
- AI as a tool with disclosure (research, brainstorming, editing assistance)
- AI encouraged (learning how to prompt effectively, understanding AI limitations)
Disclosure requirements
When students use AI, what must they disclose? We recommend requiring students to note which tools they used and how they used them. This normalizes transparency rather than concealment.
Teacher guidance
Teachers need to know how to design assignments that work in an AI world. Some suggestions:
- Focus on process, not just product. Ask for drafts, outlines, reflections.
- Design assessments that are harder to fake - in-class work, oral presentations, personal reflection.
- Be explicit in assignment instructions about what AI use is permitted.
Consequences
What happens when someone violates the policy? The answer should probably be different for first-time versus repeated violations, and should focus on learning rather than punishment.
Tool approval process
How do you decide which AI tools are acceptable for classroom use? You need a vetting process that considers privacy, age-appropriateness, and educational value.
A word on detection tools
I’m going to be blunt here. AI detection tools are unreliable. They produce false positives (flagging human writing as AI-generated) and false negatives (missing obvious AI content). Studies consistently show accuracy drops significantly after simple paraphrasing.
Accusing a student of AI cheating based solely on a detection score is unfair and puts you in a difficult position if challenged.
It’s better to design your policy around transparency and appropriate use than around catching violations.
Implementation matters as much as policy
A policy document nobody reads accomplishes nothing. You need:
- Teacher training so everyone understands and applies the policy consistently
- Student education so they know what’s expected
- Parent communication so families understand your approach
- Regular review as technology and best practices evolve
Getting started
Don’t try to create the perfect policy on your first attempt. Start with something reasonable, communicate it clearly, and plan to revise based on experience.
What matters is having explicit guidelines that reduce confusion. Perfection isn’t the goal - clarity is.
If you’re a school leader in St. Louis and want help developing your AI policy, we’d be happy to talk. This is exactly the kind of work we do with ISSL schools.