Stellenbosch Working Paper Series No. WP05/2016

School examination results are far from ideal measures of progress in schooling systems, yet if analysed with sufficient care these widely available data can serve this purpose. The paper shows how various student selection and year-on-year comparability problems in examinations data can be addressed. This is demonstrated using South African student-level results, aggregated to the school level, for Grade 12 mathematics in the years 2005 to 2013. This was a period during which provincial boundaries changed, creating a quasi-experiment amenable to impact evaluation techniques. Value-added school production functions and fixed effects models are used to establish that movement into a better performing province was associated with large student performance improvements, equal in magnitude to around a year’s worth of progress in a fast-improving country. Improvements were not always immediate, however, and the data seem to confirm that substantial gains are achieved only after several years, once students have been exposed to several grades of better teaching. The institutional factors which might explain the improvements are discussed. Spending per student was clearly not a significant explanatory variable. What did seem to matter was more efficient use of non-personnel funds by the authorities, with a special focus on educational materials; the brokering of pacts between stakeholders, including teacher unions, schools and communities; and better monitoring and support by the district office. Moreover, the education department of Gauteng, one of the provinces in question, has for many years pursued an approach which is unusual in the South African context: hiring a substantial number of senior managers within the bureaucracy on fixed-term contracts rather than on a permanent basis, the aim being to improve accountability and flexibility at senior management level.

JEL Classification:
C21, H11, I21

Keywords:
South Africa, school improvement, mathematics education, impact evaluation