Schools that try to write a 40-page AI policy never finish. Schools that write a one-pager adopt it, communicate it, and revise it. Below is the one-page template we've watched dozens of schools adopt — plus the longer policy points to bolt on if you need them. Free to copy, adapt, and use.
The principles a good school AI policy is built on
Before you write a single rule, agree on five principles. They become the test for every rule you write later.
- Student safety is non-negotiable. Any AI use that risks privacy, dignity, or age-appropriate exposure fails the policy.
- Teachers are accountable, not AI. AI is a draft tool, not a decision-maker. The human author owns the output.
- Transparency by default. AI use is disclosed, not hidden. Students disclose. Teachers disclose. The school discloses.
- Equity matters. AI use should narrow gaps, not widen them. In-school AI is provided so home access isn't the deciding factor.
- Learning is the goal, not productivity. AI that produces an answer for a student isn't always a win. AI that helps a student think is.
Every rule below maps to one of these.
The one-page AI policy template
Copy this. Edit the bracketed sections. Distribute.
[School Name] AI Policy — Version 1.0
Effective: [Date] | Owner: [Senior leader name] | Review date: [6 months from effective date]
1. Scope
This policy covers the use of generative AI tools by students, teachers, and staff of [School Name], whether on school devices, over school networks, or for school work.
2. Approved AI tools
The following AI tools are approved for use:
- Student-facing (under-13): [e.g. Askie for Schools] only. No other AI tools may be used by under-13 students in school settings.
- Student-facing (13+): [e.g. Askie for Schools, school-licensed Google Gemini for Education, Microsoft Copilot for Education].
- Teacher-facing: [e.g. MagicSchool, Brisk Teaching, Diffit, ChatGPT paid tier with chat history disabled].
Any AI tool not on this list must be approved by [Senior leader / IT lead] before use in school work.
3. Student use
Students may use approved AI tools to:
- Practise skills (e.g. maths problems, language conversation).
- Brainstorm and explore ideas.
- Receive explanations of concepts at their level.
- Get tutoring-style help on homework.
Students must not use AI tools to:
- Produce work they then submit as entirely their own without disclosure.
- Generate content that is inappropriate, harmful, or against school behaviour rules.
- Bypass content controls or impersonate another user.
Disclosure rule: When AI has helped a student produce work, the student writes "AI-assisted" on the work and briefly notes how. Honest disclosure is never penalised. Hidden AI use is treated as academic dishonesty.
4. Teacher use
Teachers may use approved AI tools to:
- Draft lesson plans, rubrics, and parent communications.
- Differentiate texts and tasks.
- Generate practice questions and worked examples.
- Assist with administrative work (reports, summaries, emails).
Teachers must not:
- Paste identifiable student information into a consumer AI tool. Use only school-licensed tools whose vendors have signed a Data Processing Agreement.
- Use AI to draft sensitive student records (IEPs, safeguarding notes, behaviour incident reports) without thorough human review and rewrite.
- Use AI as the sole basis for any high-stakes decision about a student.
Accountability rule: Teachers are the accountable authors of any AI-assisted output they share with parents, students, or administrators.
5. Privacy
- The vendor of every approved AI tool has signed a Data Processing Agreement with [School Name].
- Student data is not used to train third-party AI models.
- Students under 13 use only AI tools that comply with COPPA (or the equivalent local regulation).
- Parents may request to see what their child has done with school AI tools by contacting [Named contact].
6. Academic honesty
- Disclosed AI use is acceptable on most assignments.
- Some assignments will be designated "no AI" — these are clearly marked. Using AI on a no-AI assignment is a breach of academic honesty.
- Some assignments will be designated "AI-allowed with disclosure" — students note how AI was used.
- The school does not rely on AI-detection software as a sole basis for any academic-honesty finding (see Section 9 of our AI detection in schools guide).
7. Equity
The school provides AI access during school hours so that students without home AI access are not disadvantaged. AI-assisted assignments must not require AI use at home.
8. Reporting concerns
Any student, parent, or teacher who has a concern about AI use — safety, content, fairness, privacy — should contact [Named contact] within 5 school days of the concern arising.
9. Policy review
This policy is reviewed every [6 months] by [committee name]. Suggested revisions can be sent to [email] at any time.
End of one-page policy.
Bolt-on sections (if you need more depth)
The one-pager is enough for most schools to start. Some contexts need more — district mandates, board requirements, complex consortia. Bolt on as needed.
Bolt-on A: Detailed acceptable-use examples
For each year group, give 3–5 concrete examples of approved and disallowed use. Specifics calm parents and clarify expectations more than principles do.
Year 5 example — approved: "Asking the school AI to give you three different ways to explain fractions of an amount, then trying each yourself."
Year 5 example — disallowed: "Asking the school AI to write your story for you, then copying it into your book."
Bolt-on B: Year-group differentiation
If you're a K–12 school, your policy needs to read differently for different year groups. The simplest structure: one column per year band (K–2, 3–5, 6–8, 9–12), each saying what's approved.
Bolt-on C: Staff training requirements
If your policy mandates teacher AI literacy, say what training is required and how often.
"All teachers complete the [hour-long] AI in Teaching training annually. New staff complete it within their first half-term."
Bolt-on D: SEND / IEP-specific provisions
AI is often more transformative for students with learning differences than for any other group. A bolt-on section can clarify how AI is used within IEPs and with what safeguards. See AI for special education for the full discussion.
Bolt-on E: Generative-AI image and voice use
If your AI tool produces images or uses voice (as many do, including Askie for Schools), add a paragraph on what's appropriate: no images of identifiable individuals, no voice cloning of students or staff, etc.
Bolt-on F: Parent communication plan
How will parents be informed about the policy? How will updates be communicated? Who do they contact with questions? A four-line plan works fine here.
Bolt-on G: Incident response
What happens if a student is exposed to inappropriate AI content? If an AI tool's terms change? If a vendor reports a breach? A short incident-response paragraph keeps everyone aligned when something goes wrong.
How to actually adopt this policy
Three steps. Boringly effective.
Step 1: Run the policy past the people who'll have to live with it
A 30-minute meeting with two teachers, one parent representative, your IT lead, and a senior leader. Read the policy. Ask: "What's missing? What's unclear? What won't work in your classroom?" Edit accordingly.
Step 2: Communicate it once, properly
A staff briefing. A short letter home to parents. A short assembly for students. Same message in three forms. Done badly, this takes a fortnight; done well, it takes a Wednesday.
Step 3: Schedule the review now
Put the review date in everyone's calendar before you adopt. Policies that don't get reviewed get ignored. A six-month review keeps the policy alive.
What this template doesn't replace
It doesn't replace:
- A signed Data Processing Agreement with each vendor. The policy says you have these — you actually need them.
- Teacher training. A policy that mandates AI literacy without providing training is just paperwork.
- A pilot programme before any new tool goes school-wide. See how schools can pilot AI.
- A named contact with actual decision-making authority when something goes wrong.
The policy is the cheap part. The expensive part is the procurement, training, and review that make the policy real.
Frequently asked questions
Can we use this AI policy template at our school?
Yes — it's free to adapt and use. Replace the bracketed sections with your specifics. If you want to credit Askie, link to Askie for Schools, but it isn't required.
Does this policy comply with COPPA, FERPA, or GDPR?
The template is structured to support compliance when adopted alongside careful vendor selection and signed Data Processing Agreements with each vendor. The policy itself doesn't create compliance — it codifies how compliance is enforced.
How often should a school AI policy be reviewed?
Every 6 months in 2026 — the technology is moving fast enough that 12-month review cycles leave you out of date. Once the field stabilises, annual review is fine.
Who should own the school AI policy?
A named senior leader. Not "the IT department." Not "the SLT." A named person who is accountable for the policy being current, communicated, and enforced.
What if our district already has an AI policy?
Use this as a school-level implementation document underneath the district policy. The district policy says what; your school document says how, here, with these tools.
Want an AI platform that already fits inside a sensible school policy? Askie for Schools is built around the principles in this template — teacher visibility, COPPA-aligned safety, transparent parent access, age-calibrated content. Start your pilot →