Self-evaluation and improvement planning

Cross-cutting themes 2023-2024


Across all sectors in the 2023-2024 academic year, the majority of providers evaluated their work appropriately, while a minority did not do so well enough.

Although there are common features that underpin effective self-evaluation and improvement planning, the context of the provision is also key. For example, non-maintained settings vary greatly across Wales, ranging from very small entities providing early years education from village or church halls, where all resources must be packed away at the end of every day, to large private businesses employing dozens of staff. Leaders and practitioners working in the non-maintained sector are diverse, and their levels of expertise and understanding of leadership and management vary widely.

On the whole, many leaders evaluate their non-maintained settings accurately. In these instances, their self-evaluation processes inform their improvement plans purposefully. Their planned actions are reasonable and realistic, and in most cases bring about the intended results. In a few settings, leaders struggle to identify the most important areas for improvement. They prioritise areas of interest to them, rather than those most in need of improvement.

Where self-evaluation was effective in schools and PRUs, leaders at all levels gathered first-hand evidence from a wide range of sources and nearly all staff were able to contribute meaningfully to these processes. Often, governors or the equivalent managerial body were involved appropriately as well. Typically, sources of evidence included observing teaching, looking at learners’ work and gathering the views of staff, learners, parents and other relevant stakeholders. Leaders also carried out relevant analyses of a range of data to give a rigorous and rounded picture of provision and learner progress. Leaders ensured that they evaluated all aspects of their provision, including, for example, the impact of professional learning on teaching and learning as well as the impact of how grants were spent. In a few instances, providers used external partners, such as school improvement advisers or senior leaders from other providers, to validate and contribute to their own findings.

Within the local government education services sector, self-evaluation and planning for improvement were identified as areas for improvement in all recent inspections. In 2023-2024, all local authorities inspected received a recommendation to strengthen aspects of their evaluation and improvement processes, as was also the case in 2022-2023. Improvement plans often did not set out clear success criteria to help officers consider the impact of actions on improving outcomes and provision for children and young people. Although officers monitored whether actions had been completed, they did not always evaluate the impact of their work well enough. This made it difficult to identify strengths and areas for improvement precisely enough and led to variation in the quality of self-evaluation and planning for improvement within local authorities.

Across all the sectors we inspect, we found that, where self-evaluation was embedded within the provider’s culture, leaders planned a coherent programme of monitoring, evaluating and reviewing activities well in advance and shared this with all staff. The evidence gathered from effective self-evaluation processes enabled leaders at all levels to have a secure understanding of the progress learners were making. This information was used strategically to allocate resources, for example to support aspects of learners’ skills which needed targeted support to improve.

In the very best examples in the schools and PRUs sectors, leaders judged teaching by its impact on learning and teachers were reflective, evaluating their own sessions regularly and refining their teaching as a result to ensure that learners made good progress. Staff welcomed the scrutiny arising from an evaluative culture. They were open and keen to innovate and improve.

Where the provider was a partnership of organisations, effective self-evaluation included having consistent processes across the different centres and a clear mechanism for bringing this information together coherently to give an accurate and shared overview and understanding of provision and standards.

Overall, strong self-evaluation enabled leaders to know their organisation’s strengths and areas for improvement well. In many cases, there was a clear link between the findings from self-evaluation, the priorities for improvement and the programme of universal and targeted professional learning to support progress towards these priorities. However, effective self-evaluation does not always ensure good improvement planning, and vice versa. In a few providers, leaders evaluated thoroughly and accurately but did not use this information effectively to tackle weaknesses in provision. Conversely, a few providers had robust plans for improvement that were having a beneficial impact, but, because self-evaluation was not effective enough, these plans were not focused on the aspects of provision that needed the most urgent improvement.

Effective practice: Adamsdown Primary School

How Adamsdown Primary used distributed leadership to support successful self-evaluation processes and implement whole-school changes | Estyn (gov.wales)

To sustain recent improvements, the headteacher needed to review the school’s vision with all stakeholders. This included embedding a programme of monitoring, evaluating and reporting to ensure that all stakeholders were aware of the school’s baseline of standards across all areas of school life, and of its priorities in the run-up to implementing the new curriculum. A timeline was then developed to bring the vision into action. Through a series of whole-school training days, the school’s stakeholders developed its aims for the next three to five years. The school vision was rewritten to reflect accurately the diverse needs of learners. This process involved leaders, governors, staff, pupils, parents and community partners.

To support the leadership team in enacting the vision, the assistant headteacher and teachers on the upper pay scale led an innovative trial of a new system for grouping pupils in 2017. Analysis of data from pupil progress meetings showed that pupils in the trial made accelerated progress. Teaching and Learning Responsibility (TLR) holders analysed this data across the core subjects of maths, English and science and, supported by main pay scale teachers who led foundation subject areas, reviewed all aspects of the curriculum.

As mentioned, across all sectors, the majority of providers planned effectively for improvements and a minority did not plan well enough. Where improvement planning was effective, leaders identified a small number of important priorities from self-evaluation findings and worked on these alongside national priorities, such as attendance, Welsh, the curriculum and ALN reform, and mitigating the impact of poverty on well-being and attainment. Leaders applied national priorities sensibly to their own contexts. They ensured that plans were developed collaboratively, giving middle leaders, other staff and governors a sense of shared ownership. These plans included clear lines of responsibility and accountability at different levels of leadership. There were realistic timescales for achieving precise targets and an understanding by all of what success would look like and how they would evaluate or measure it.

Leaders allocated appropriate resources to support these plans and, where relevant, provided the professional learning that staff needed to improve their practice as an integral part of the process. Each provider’s performance management arrangements were linked to its priorities. In the best examples, improvement planning was a continuous process in which staff revisited, reviewed and amended plans at regular intervals. Leaders ensured that progress towards priorities was shared with staff and was often an important focus in meetings. Above all, effective planning had a demonstrably positive impact on the quality of teaching and on learners’ progress, and this impact was understood by all staff.

Effective practice: Pontarddulais Comprehensive School

School Improvement – How an inclusive cycle of school improvement processes continually improves provision and pupil outcomes. | Estyn (gov.wales)

A distinctive feature of the school improvement cycle is the annual ‘School Improvement Launch’, a collaborative session involving staff, governors, and pupil representatives. This inclusive process ensures that diverse perspectives are considered, fostering shared ownership of strategic priorities. This session shapes the School Development Plan (SDP), a dynamic tool guiding the entire school community towards shared goals.

The SDP triggers the planning phase of the school improvement cycle, which includes Area Development Plans (ADPs) that are similar in style and content to the SDP, though they are also designed to serve their context at an area or subject level. In turn, performance management objectives are natural outcomes of the SDP and ADPs. Aligning these processes ensures synergy and collegiate responsibility for school improvement. The SDP is RAG-rated (red, amber, green) by the Extended Headship Team, which includes senior and middle leaders, and is regularly scrutinised by governors, ensuring a clear understanding of progress and of areas that require additional attention. Members of the Extended Headship Team lead on individual strategies, providing a continuous feedback loop within fortnightly link meetings.

Across Wales, a minority of schools and PRUs were given a recommendation following inspection to improve their self-evaluation and improvement planning. These recommendations highlighted three main areas of weakness. Firstly, in a few providers, leaders did not focus self-evaluation well enough on identifying strengths and areas for improvement in teaching. Secondly, when evaluating teaching, a few providers did not consider its impact on learning. Thirdly, in a few providers, leaders did not focus their improvement planning on the aspects of provision that needed the most urgent improvement. In a very few providers, there were other specific weaknesses, such as not gathering the views of learners or excluding governors from self-evaluation and improvement planning processes. Also, in a very few providers, there were broad weaknesses with these processes: they were too bureaucratic and time-consuming, lacked rigour, were too piecemeal to be useful, or were at an early stage of development.

Unsurprisingly, providers that go into a statutory category nearly always receive recommendations to strengthen their self-evaluation and improvement planning processes, since leaders’ inability to identify important weaknesses and to plan effectively for improvements is highly likely to be a factor in their poor performance. It is, therefore, often the case that providers across sectors that come out of statutory categories have made significant changes to their processes to make them more collaborative, rigorous and transparent. In schools and PRUs, for example, leaders have ensured that the focus of self-evaluation is on the quality of teaching and its impact on learners’ progress. They have then ensured that improvement planning focuses precisely on those aspects of teaching and learning requiring most attention and that there is a coherent programme of professional learning to support this. Here are two examples of providers that were removed from statutory categories in 2023-2024 in this way:

Monitoring report St John Lloyd Catholic Comprehensive School 2023 (gov.wales)

Monitoring report Dewstow Primary School 2024 (gov.wales)


Questions for self-reflection

  • What are the best sources of evidence for evaluating any specific aspect of our provision?
  • How might we involve learners, staff, parents, governors and the wider community meaningfully in evaluating what we do and planning for improvement?
  • What is the balance in our self-evaluation work between first-hand evidence and summative outcomes? Do we rely too heavily on one or the other when evaluating provision and learner progress?
  • Are there any important aspects of our provision that we currently do not evaluate? Should we be evaluating them? How often?
  • To what extent do we evaluate our work by its impact on the quality of teaching and the progress that learners make?
  • Overall, how rigorous, honest and accurate is our self-evaluation? How confident are we that we know our strengths and areas for improvement well? How consistent is our self-evaluation at different levels of leadership?
  • To what extent do our priorities for improvement flow naturally from the evaluation of our provision and learner progress?
  • How will we measure whether we have made (enough) progress or not in meeting our improvement targets? Are our success criteria precise and measurable?
  • Are there clear lines of responsibility and accountability within our plans? How realistic are our timescales for achieving our priorities?
  • Have we allocated sufficient resources to achieving our priorities?
  • To what extent do our programme of professional learning and our performance management arrangements support our improvement plans?
  • How manageable and sustainable are our self-evaluation and improvement planning processes for senior and middle leaders and other staff?

There is a wide range of useful self-evaluation resources for providers to use here.