Complete guide to K-12 AI governance
Everything you need to implement AI governance at your school, from policy development and training to enforcement and ongoing review.
Last updated: January 2025
In this guide
1. Why AI governance matters now
2. The governance framework
3. Developing your AI policy
4. Training your community
5. Enforcement that works
6. Evaluating and approving AI tools
7. Annual review process
8. Implementation checklist
1. Why AI governance matters now
Your students are already using AI. A 2024 survey found that 68% of high school students have used ChatGPT for schoolwork. The question isn't whether AI is in your school - it's whether you're guiding how it's used.
Without clear governance:
- Teachers apply inconsistent standards (one class allows AI, another treats it as cheating)
- Students are confused about expectations
- Staff may inadvertently expose sensitive data to AI tools
- Parents don't know what to expect or how to support their children
Good governance doesn't mean banning AI. It means creating clarity so everyone can make informed decisions.
2. The governance framework
Effective AI governance has four pillars:
Pillar 1: Clear policy
Written guidelines that define acceptable use for students, teachers, and staff. The policy should be specific enough to be actionable but flexible enough to adapt as technology evolves.
Pillar 2: Training
Everyone needs to understand the policy and how to apply it. This includes teachers (how to design appropriate assignments), students (when AI is and isn't acceptable), and parents (how to support learning at home).
Pillar 3: Enforcement
Consequences for policy violations, focused on education rather than punishment. Also includes detection strategies that don't rely on unreliable AI detection tools.
Pillar 4: Review
Annual review process to update policies as technology and best practices evolve. AI capabilities change rapidly - your governance should too.
3. Developing your AI policy
Who should be involved
Policy development should include:
- Academic leadership (curriculum implications)
- IT (technical feasibility, tool evaluation)
- Faculty representatives (classroom implementation)
- Legal counsel (liability, compliance)
- Student representatives (for high schools)
What the policy should cover
For students:
- When AI use is prohibited (tests, original creative work)
- When AI use is permitted with disclosure
- When AI use is encouraged (learning activities)
- How to disclose AI use
- Consequences for violations
For teachers:
- How to communicate AI expectations on assignments
- Approved AI tools for professional use
- Student data considerations when using AI
- How to design AI-appropriate assessments
For staff:
- Approved AI tools for workplace tasks
- Confidentiality requirements
- Data handling procedures
The disclosure requirement
We recommend requiring disclosure rather than detection. Students should note:
- Which AI tools they used
- How they used them (brainstorming, editing, generating content)
- What portions were AI-generated vs. original
This approach normalizes transparency instead of incentivizing students to hide their AI use. A brief note is enough, for example: "I used ChatGPT to brainstorm topic ideas and to check the grammar of my final draft; the analysis and writing are my own."
4. Training your community
Teacher training
Teachers need practical training on:
- Understanding what current AI can and can't do
- Designing assignments that work in an AI world
- Communicating expectations clearly to students
- Addressing suspected policy violations
- Using AI appropriately for their own work
Recommendation: A 2-hour workshop at the start of the year, with monthly 15-minute updates
Student training
Students need to understand:
- What the policy requires
- Why disclosure matters
- How to use AI tools productively (not just to get answers)
- AI limitations and how to verify information
Recommendation: Age-appropriate training integrated into existing tech literacy curriculum
Parent communication
Parents should know:
- Your school's AI policy and philosophy
- What their children are learning about AI
- How to support appropriate AI use at home
- Who to contact with questions
Recommendation: Clear communication at the beginning of the year, with an FAQ available
5. Enforcement that works
Why detection tools fail
AI detection tools (GPTZero, Turnitin's AI detection, etc.) are unreliable. Studies consistently show:
- High false positive rates (flagging human writing as AI)
- Easy circumvention through simple paraphrasing
- Bias against non-native English speakers
Don't base accusations solely on detection scores. The reputational and legal risk isn't worth it.
Better approaches
Process-based assessment: Require outlines, drafts, and revision notes. It's harder to fake a documented process than a final product.
In-class components: Include some assessed work done in class where you can observe the process.
Oral defense: Have students explain and defend their work. Students who understand their work can discuss it; students who submitted AI output often can't.
Personal reflection: Require personal connection or reflection that's harder to generate convincingly.
When violations occur
Focus on education rather than punishment:
- First violation: Conversation about why it matters, redo assignment
- Repeated violations: Escalating consequences, parent involvement
- Pattern of behavior: Academic integrity process
The goal is teaching responsible AI use, not catching cheaters.
6. Evaluating and approving AI tools
Before any AI tool is used with students, evaluate:
Privacy and compliance
- What data does the tool collect?
- Is it FERPA compliant?
- Will the vendor sign a Data Privacy Agreement?
- Is data used for model training?
Educational value
- Does this serve a genuine educational purpose?
- How does it align with learning objectives?
- What training do teachers need?
Practical considerations
- Does it integrate with existing systems?
- What is the total cost?
- What happens to data when we stop using it?
Maintain lists of approved, restricted, and prohibited tools. Update regularly.
7. Annual review process
AI capabilities change rapidly. Your governance should evolve too.
Annual review should include:
- Review of policy effectiveness (what's working, what isn't)
- Survey of teacher experiences
- Update to approved tools list
- Refresh of training materials
- Scan of regulatory changes
- Benchmark against peer schools
Trigger events for mid-year review:
- Major new AI capabilities released
- Significant incident at your school
- Regulatory changes
- New guidance from accrediting bodies
8. Implementation checklist
Getting started (do this month)
- Form AI governance committee with key stakeholders
- Audit current AI tool usage across the school
- Download and customize our policy template
Foundation (this quarter)
- Finalize and approve AI acceptable use policy
- Establish approved/prohibited tools lists
- Conduct initial teacher training
- Communicate policy to parents
Ongoing
- Monthly check-ins on policy effectiveness
- Quarterly review of approved tools
- Annual comprehensive policy review
Need help implementing AI governance?
This guide is designed to be self-serve. If you want expert help developing your policy, training your staff, or setting up governance processes, we're here.
Talk to an expert