How Does Ofsted Inspect the Use of AI?
What School Leaders Need to Know About Ofsted's Approach to AI
Artificial intelligence is increasingly present in schools, from pupils using ChatGPT for homework to administrative tools that help with attendance tracking. But how will Ofsted look at AI during inspections? In this blog, we answer some of the key questions school leaders are asking.
Will Ofsted specifically inspect our use of AI?
No. Ofsted does not evaluate AI as a stand-alone element of inspections, and inspectors will not directly assess any AI tools you use. However, inspectors may consider the impact that AI use has on outcomes and experiences for children and learners. This includes both how your school uses AI and how you respond to AI use by pupils, parents and staff.
Are we required to use AI in a particular way?
No. Ofsted does not expect or require schools to use AI in any specific way, or to use it at all. At this stage, Ofsted acknowledges it doesn't have the evidence needed to define what constitutes "good use" of AI for inspection purposes.
However, you should be aware that pupils and staff are likely already using AI in connection with education. Whether this is pupils using AI tools for homework or support staff using AI for administrative tasks, it's important that you're aware of where AI is being used, how it's being used and the impact this is having.
Will inspectors ask about AI during our inspection?
Inspectors are not required to actively look for AI use, and inspection reports won't mention AI unless it's crucial to broader inspection decisions. Inspectors will only consider AI when it's relevant to their evaluations in areas like safeguarding, curriculum or attendance management.
If inspectors find that your school's use of, or response to, AI has a significant impact on children and learners, they'll record it as evidence in the same way they would any other factor.
What if pupils are using AI for homework or coursework?
Where relevant, inspectors will explore how leaders ensure that AI use by pupils at school or at home is in their best interests. This might include looking at your school's policy on pupils using AI for homework and coursework.
The key question is whether you've made sensible decisions about managing this use, just as you would with any other educational tool.
What about the risks associated with AI?
While Ofsted won't evaluate AI risks separately, these concerns will be addressed when they have implications for areas already considered during inspection:
- Data protection: Many AI applications use large amounts of data, including personal data. Inspectors may ask questions similar to those they ask about any data collection, storage and processing.
- Safeguarding: AI can pose unique safeguarding risks. Inspectors will consider this as part of their general evaluation of your safeguarding culture. For example, they may look at how you ensure pupils' use of AI on personal devices is in their best interests.
- Bias and discrimination: AI can perpetuate bias present in the data it processes. While this is a new context, inspectors already consider risks of bias and discrimination and the measures you've taken to mitigate them.
Inspectors may ask what steps you've taken to ensure your use of AI properly considers these risks and whether you have assurance processes to identify emerging risks.
Do inspectors need to understand how AI software works?
No. Ofsted doesn't expect inspectors to understand how AI software has been designed, any more than they need to understand the programming of your firewall software to evaluate data security arrangements.
The evaluation focuses on your decision-making, what you've considered, and the impacts on children and learners, not the technical details of the tool itself.
Can you give examples of how inspectors might consider AI?
Here are some practical scenarios:
- If your school uses AI to identify causes of absence, inspectors may consider how it forms part of your overall approach to tackling absence.
- If you use AI-generated summaries of meetings or reports, inspectors may want to understand how you ensure these are accurate.
- If pupils are using AI inappropriately, inspectors may evaluate how you've responded and addressed the impact.
If you're wondering how to approach AI in your setting, we share five areas you may wish to look into here.
What if we don't use AI at all?
That's absolutely fine. Many schools and pupils don't use AI, and Ofsted has no expectation that you should. The guidance simply clarifies how inspectors will approach AI if they do encounter it during an inspection.
More information on how schools could approach AI can be found here.
Key Takeaways for School Leaders
✓ You don't need to use AI, but be aware that pupils and staff probably are
✓ Make sensible decisions about AI use, just as you would with any other tool
✓ Consider the impact on pupils' outcomes and experiences
✓ Address the risks—particularly around data protection, safeguarding and bias
✓ Have policies in place for pupil use of AI in homework and coursework
✓ Document your decision-making about AI use and how you've considered risks
Looking Ahead
Ofsted has stated that this is not a final position and that its approach will be updated as understanding of AI use in education continues to develop. School leaders should stay informed but needn't feel pressured to radically change their approach. The same principles of sound decision-making and keeping pupils' best interests at the centre apply, whether you're using AI or not.
This free editable download offers sample reflection questions across key areas of school life, plus prompts to help you think more widely about the ethical use of AI in education.