
England: Data Protection Impact Assessments for School AI Systems (2026)

Tags: policy · data-protection · privacy · ai-governance

By Q1 2026, all schools in England are required to conduct a Data Protection Impact Assessment (DPIA) for any AI tool they plan to use, with support from the Department for Education and the Information Commissioner’s Office (ICO).

By mid-2026, every school must complete an AI data protection checklist. The framework forbids entering personally identifiable student data into open AI platforms and requires that any closed or proprietary AI system in use has robust privacy safeguards. By the end of 2026, the ICO will audit a random sample of schools’ AI data practices to verify compliance.
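The framework does not prescribe a technical mechanism for the "no student PII into open platforms" rule. As a purely illustrative sketch (the function names, patterns, and the UPN check are our assumptions, not part of the framework), a school's IT team might screen outgoing prompts for common identifiers before they reach a public AI service:

```python
import re

# Hypothetical example patterns for student PII. A Unique Pupil Number (UPN)
# is one letter followed by 12 digits; the other patterns are illustrative
# and would need extending for real use.
PII_PATTERNS = {
    "upn": re.compile(r"\b[A-Z]\d{12}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "date_of_birth": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
}

def contains_student_pii(text: str) -> list[str]:
    """Return the names of any PII patterns found in the text."""
    return [name for name, pattern in PII_PATTERNS.items() if pattern.search(text)]

def submit_to_open_ai_platform(prompt: str) -> str:
    """Refuse to forward a prompt that appears to contain student PII."""
    hits = contains_student_pii(prompt)
    if hits:
        raise ValueError(f"Blocked: prompt may contain PII ({', '.join(hits)})")
    return "submitted"  # placeholder for the real platform call
```

Pattern matching of this kind catches only obvious identifiers (free-text names, for example, slip through), which is one reason the missing technical standards noted below matter.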

Who it affects: All schools in England; IT and safeguarding leadership; data protection officers.

What is notably missing: Details on what constitutes “robust privacy safeguards” for proprietary systems; guidance on remediation if schools fail audits; clarity on liability if breaches occur despite compliance efforts; specification of technical standards for data handling.