response to a query by another member (in Hansard; see
http://www.publications.parliament.uk/pa/ld200506/ldhansrd/vo050620/text/50620w02.htm).
When shown that the same relationship held for all secondary schools in England,
Lord Adonis did not revisit his reply, and the findings were simply ignored.
Making the official value-added analysis more complex, via the addition of contextual
information about the pupils, may disguise but will not solve the problem highlighted
in this paper. The additional complexity will reduce further the number of potential
critics able to understand the methods. The use of additional information about the
social background of pupils is likely to decrease the scatter shown in Figure 1, making
the relationship between school intakes and school outcomes stronger. The inclusion
of social background information in school performance figures will also have the
unintended consequence that we will no longer be able to consider the extent to which
schools do, or do not, compensate for differences in those backgrounds. At present, if
VA worked, we could see whether schools were equally effective for rich and poor
pupils, by disaggregating the VA by eligibility for free school meals (FSM), for
example. But with contextualised VA, where FSM is factored into the calculation, it
no longer makes sense to disaggregate by FSM. The same point can be made about
ethnicity, language, and special need. Plans to make value-added analysis more
complex through the use of advanced regression techniques will also be
counterproductive: they will further reduce the number of potential critics able to
understand the methods, but cannot overcome the problem of correlation noted here.
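To make the disaggregation point concrete, the following Python sketch uses synthetic
data. The variable names, the data-generating process, and the use of OLS residuals as
a stand-in for VA scores are all assumptions for illustration, not the official DfES
calculation.

    # A sketch of the disaggregation point, using synthetic data. The variable
    # names and data-generating process are illustrative assumptions only, and
    # OLS residuals stand in for VA scores; this is not the DfES calculation.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    fsm = rng.random(n) < 0.15                          # free school meal eligibility
    prior = rng.normal(0, 1, n) - 0.5 * fsm             # FSM pupils start lower, on average
    outcome = prior + rng.normal(0, 1, n) - 0.3 * fsm   # and (here) make less progress

    def va_scores(y, *covariates):
        """Value-added as OLS residuals: outcome minus its prediction from the covariates."""
        X = np.column_stack([np.ones(len(y)), *covariates])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return y - X @ beta

    # Simple VA adjusts for prior attainment only; disaggregating by FSM then
    # reveals that FSM pupils make less progress in these synthetic data.
    va = va_scores(outcome, prior)
    print("simple VA:     FSM", round(va[fsm].mean(), 2), " non-FSM", round(va[~fsm].mean(), 2))

    # Contextualised VA factors FSM into the model, so the residuals average
    # zero within each FSM group by construction: the comparison vanishes.
    cva = va_scores(outcome, prior, fsm.astype(float))
    print("contextual VA: FSM", round(cva[fsm].mean(), 2), " non-FSM", round(cva[~fsm].mean(), 2))

The first pair of means shows a gap (negative for FSM pupils under these assumptions);
the second pair is exactly zero by construction, which is why contextualised VA can no
longer be used to ask whether schools compensate for disadvantage.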
The percentage of variation in outcomes found at school level (that is, the share of
the total variance lying between schools rather than between pupils within schools),
usually termed the ‘school effect’, is small, and incorrectly suggests that schools are
making little difference to their pupils. There is also the confusing situation that
the same school may appear effective on one measure (such as attainment) but not on
another (such as dropout), or effective for one age group but not another.
Consequently, policies based on VA results and designed to improve test performance
for one age group can harm performance in other areas (Rumberger and Palardy 2005).
The solution to all of these issues is not a more complex value-added analysis. It
lies in re-thinking what we want value-added analysis to achieve. There are simpler
and more scientific ways of measuring the impact of schools. One alternative,
suggested recently, requires no consideration of prior attainment or contextual
variables, relying instead on the discontinuity between school years or grades to
estimate the absolute effect of going to a school in comparison to not going to
school at all (Luyten 2006).
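To illustrate the logic of such a discontinuity design, the following Python sketch
uses synthetic data. It is a schematic of the general idea rather than Luyten's actual
specification, and every name and parameter in it is an assumption. Because a
date-of-birth cutoff assigns otherwise similar pupils to adjacent year groups, the
jump in attainment at the cutoff, after allowing for the smooth effect of age,
estimates the effect of an extra year of schooling.

    # A sketch of the discontinuity idea, not Luyten's actual specification.
    # The synthetic data, cutoff rule, and linear age adjustment are
    # illustrative assumptions only.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5_000
    # Age in months relative to the school-entry cutoff; pupils at or above the
    # cutoff were placed in the higher year group, so grade is determined by age.
    age = rng.uniform(-12, 12, n)
    grade = (age >= 0).astype(float)
    # Attainment grows smoothly with age, plus a jump for the extra year of schooling.
    score = 0.4 * age + 6.0 * grade + rng.normal(0, 3, n)

    # Fit score = a + b*age + c*grade: the same age trend on both sides of the
    # cutoff, plus a discontinuity. The estimate of c recovers the effect of a
    # year of schooling (about 6 here) with no prior-attainment or context data.
    X = np.column_stack([np.ones(n), age, grade])
    (a, b, c), *_ = np.linalg.lstsq(X, score, rcond=None)
    print(f"estimated effect of one extra year of schooling: {c:.2f}")

Note that nothing in the estimate requires prior attainment or contextual variables;
the cutoff itself does the work of creating comparable groups.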
The second kind of conclusion from this paper is more practical. Until concerns about
value-added analyses have been resolved, it is not reasonable to use them for practical
purposes.2 Parents cannot rely on them when choosing schools. School leaders cannot
rely on them to judge the effectiveness of teachers or departments, and officials
cannot rely on them to make decisions about the quality of education delivered in
schools. Rather, what this re-analysis shows is that schools with a low-attaining pupil
2 Clearly, there will never be an ideal measure able perfectly to summarise the performance of a school.
That is not the point. If accepted, what this paper shows is that the DfES approach is nothing like a
solution to the problem of measuring pupils' progress independently of their raw-score attainment. It is
neither good enough nor even the best approach currently available.