A better way to evaluate schools
Value-added measures of performance look beyond basic exam results, helping give a much clearer view of how schools perform within their regional and social context.
Since 2001 Bristol has helped to change the way that countries around the world think about evaluating schools. The research has generated new knowledge about performance measures and the context that promotes student learning.
These insights have given policy makers, education practitioners, NGOs and the public a far better understanding of how schools perform within their regional and social context. The research is also helping other countries move beyond previously limited and much-criticised league table systems based on crude raw scores.
The work carried out by the Graduate School of Education (GSoE) focused on identifying new ‘value added’ measures of performance that look beyond basic exam results.
"The long-standing problem with league tables based on crude raw exam scores is that they tell you very little about standards based on the progress of students over time or where a school might be heading. What GSoE research has demonstrated is that a school’s effectiveness is best seen as something that is context- and time-specific. For example, we’ve looked at the impact of national and regional differences, as well as other factors such as socio-economic context and pupils moving schools. In the process, we’ve shown that raw school league tables will always have limitations as guides to school evaluation or choice." - Professor Sally Thomas, Professor in Education.
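The core idea behind value-added measures can be illustrated with a toy calculation (invented data and a deliberately simplified model, not the GSoE methodology): predict each pupil's exam score from their prior attainment, then treat a school's average prediction gap as its value added.

```python
# Toy value-added sketch with hypothetical pupil data.
# A pupil's "value added" is the gap between their actual exam score and
# the score predicted from prior attainment; a school's value added is
# the average of those gaps across its pupils.

# (prior_score, exam_score, school) -- invented illustrative data
pupils = [
    (50, 55, "A"), (60, 64, "A"), (70, 76, "A"),
    (50, 48, "B"), (60, 59, "B"), (70, 68, "B"),
]

# Ordinary least-squares fit of exam_score on prior_score
n = len(pupils)
mean_x = sum(p[0] for p in pupils) / n
mean_y = sum(p[1] for p in pupils) / n
slope = (sum((p[0] - mean_x) * (p[1] - mean_y) for p in pupils)
         / sum((p[0] - mean_x) ** 2 for p in pupils))
intercept = mean_y - slope * mean_x

# Value added per school = mean residual of its pupils
residuals = {}
for prior, exam, school in pupils:
    residuals.setdefault(school, []).append(exam - (intercept + slope * prior))
value_added = {s: sum(r) / len(r) for s, r in residuals.items()}

print(value_added)  # school A above expectation, school B below
```

On this data both schools have identical raw intakes, yet school A's pupils progress faster than predicted and school B's slower, which a raw-score league table would miss entirely.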
Influencing national and international policy
This key development in understanding is having a major impact on government thinking both in the UK and overseas. In the UK it influenced key national policies including school self-evaluation, the Pupil Level Annual Schools Census (PLASC) and the introduction of value-added measures of school performance by the Department for Children, Schools and Families (DCSF) in 2006.
Other countries including China and a number in Africa have carried out school evaluation studies using value-added techniques. The University has also been invited to advise Australian Government officials, the Chilean Ministry of Education, and the EU education conference for the French Presidency.
Developing and sharing analytical techniques
To ensure that word continues to spread, the GSoE also created evaluation tools that other institutions can use.
"A lot of our work has involved creating new and detailed datasets that we’ve used to measure and evaluate school performance in innovative ways," explains Professor Thomas. "In the process we’ve developed new analytical techniques. We wanted to make sure this learning is passed on to both academic and non-academic audiences."
To do so, Professor Thomas has worked with Professor Harvey Goldstein, Dr George Leckie and other colleagues from the Centre for Assessment and Evaluation Research in Education and the Centre for Multilevel Modelling. They have applied multilevel statistical modelling methods using MLwiN, software developed by the Centre for Multilevel Modelling. The software enables rigorous quantitative analysis of international educational datasets and provides evaluation tools that are of real practical use.
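A key feature of the multilevel models behind such analyses is that estimated school effects are "shrunken" towards zero in proportion to how little data a school contributes, so small schools do not receive extreme ratings by chance. The sketch below shows only that shrinkage step, with invented variance figures and school data; in practice software such as MLwiN estimates these quantities from the full pupil-level dataset.

```python
# Illustrative shrinkage step from a random-intercept multilevel model.
# All numbers are hypothetical; real analyses estimate the variances.
sigma2_u = 4.0   # assumed between-school variance of true school effects
sigma2_e = 36.0  # assumed within-school (pupil-level) residual variance

# Raw mean pupil residual and pupil count per school (invented data)
schools = {"A": (3.0, 20), "B": (-2.5, 200), "C": (5.0, 5)}

def shrunken_effect(raw_mean, n_pupils):
    """Shrink a school's raw mean residual towards zero.

    The reliability weight approaches 1 for large schools (their raw
    mean is trusted) and 0 for tiny ones (little evidence, so the
    estimate is pulled towards the overall average of zero).
    """
    reliability = sigma2_u / (sigma2_u + sigma2_e / n_pupils)
    return reliability * raw_mean

estimates = {s: shrunken_effect(m, n) for s, (m, n) in schools.items()}
print(estimates)
```

Note that school C's raw residual (5.0) is the largest, but with only five pupils its shrunken estimate falls below school A's: exactly the caution against over-interpreting small-school results that value-added reporting is designed to build in.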
These tools have further enhanced understanding of how schools perform. For example, in 2010 the UK Department for Education (DfE)’s statisticians used MLwiN to calculate value-added school performance measures as part of the Ofsted school inspection process.
The DfE also used MLwiN to construct the Learning Achievement Tracker, a tool for schools and colleges to track the progress made by students after compulsory schooling. Other non-academic MLwiN users include Statistics Canada, Statistics Norway, the Netherlands Bureau of Statistics, UNESCO and the World Health Organisation.
"In using value-added approaches to evaluate different aspects of school and educational effectiveness, we wanted to make sure that the insight we’ve developed can be used by academics who want to further research in this area. And just as importantly, we wanted to make sure it can be used by education specialists involved in implementing school evaluation on the ground," says Professor Thomas.