Even though all tests are in the fields of ‘Mathematics’ and ‘Science’, they do not necessarily
test the same cognitive skills: the IEA tests relate to common elements of school curricula
across countries, while the IAEP is geared towards the curriculum in the USA, building on the
national testing procedures developed by the National Assessment of Educational Progress
(NAEP). The OECD PISA test takes a more real-world approach and claims to assess the skills
considered essential for full participation in society. These differences do not, however, seem
to matter much for measured student performance. For example, the correlation coefficient
between the test results for the 18 countries participating in both TIMSS 2003 and PISA 2003
is 0.94.6
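The cross-test correlations reported here and in footnote 6 are Pearson correlations computed over country mean scores. A minimal sketch of that calculation in Python, using made-up country averages purely for illustration (the actual country-level scores come from the sources listed in Table 1):

```python
import numpy as np

# Hypothetical country mean scores for two tests (illustrative values only;
# the paper uses the 18 countries participating in both TIMSS 2003 and PISA 2003).
timss_2003 = np.array([605, 589, 570, 552, 537, 529, 508, 498, 495, 489,
                       479, 467, 461, 435, 411, 406, 390, 366], dtype=float)
pisa_2003  = np.array([550, 542, 536, 527, 523, 514, 503, 498, 493, 490,
                       485, 466, 468, 437, 423, 417, 385, 356], dtype=float)

# Pearson correlation between the two sets of country averages.
r = np.corrcoef(timss_2003, pisa_2003)[0, 1]
print(f"cross-country correlation: {r:.2f}")
```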
Table 1. Data sources description

Year | Test | Acronym | Test subjects | Test age | Countries | Data source
1980-81 | IEA | SIMS | Mathematics | 13 years | 3 in 1980, 14 in 1981 | Lee and Barro (1997); Travers and Westbury (1989)
1983-85 | IEA | SISS | Science | 14 years | 11 in 1983, 11 in 1984, 1 in 1985 | Postlethwaite and Wiley (1992)
1990-91 | IAEP | IAEP | Mathematics | 13 years | 2 in 1990, 17 in 1991 | Lee and Barro (1997)
1994-95 | IEA | TIMSS | Mathematics | Grade 8 | 4 in 1994, 36 in 1995 | timss.bc.edu/
1998-99 | IEA | TIMSS-repeat | Mathematics | Grade 8 | 6 in 1998, 31 in 1999 | timss.bc.edu/
2000-02 | OECD | PISA 2000 | Mathematics | 15 years | 32 in 2000, 9 in 2002 | www.pisa.oecd.org
2002-03 | IEA | TIMSS 2003 | Mathematics | Grade 8 | 7 in 2002, 38 in 2003 | timss.bc.edu/
2003 | OECD | PISA 2003 | Mathematics | 15 years | 40 in 2003 | www.pisa.oecd.org
Note. For some countries, separate scores are reported for different parts of the country; we have calculated
country averages using population as weights. The IEA tests (except the 1983/84 test) and the IAEP tests are
conducted in the fall in the southern hemisphere and in the spring in the northern hemisphere. PISA 2000
originally included only five non-OECD countries, but nine additional non-OECD countries conducted the same test in 2002.
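Where separate scores are reported for different parts of a country, the note above describes combining them with population weights. A minimal sketch of that weighting, with hypothetical regional scores and populations (the region names and figures are assumptions for illustration only):

```python
# Hypothetical regional scores and populations for one country whose results
# are reported separately for two parts of the country.
regions = [
    {"score": 512.0, "population": 10_000_000},
    {"score": 488.0, "population": 4_000_000},
]

# Population-weighted country average, as described in the table note.
total_pop = sum(r["population"] for r in regions)
country_mean = sum(r["score"] * r["population"] for r in regions) / total_pop
print(f"country average: {country_mean:.1f}")
```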
Recently, it has become common to report national averages based on Item Response Theory,
which weights the different questions by their difficulty (“Warm estimates”, Warm, 1989)
and standardizes the scores such that the average across all students in the participating
countries is 500 with a standard deviation of 100. The PISA studies in particular employ this
6 The correlation coefficient between the average Science and Mathematics scores in TIMSS-repeat 1999 and
PISA 2000 is 0.87, and for IAEP 1991 and TIMSS 1995 the correlation coefficient is 0.80. The correlation
coefficients are calculated using the adjusted test scores described below. Interestingly, as can be seen from
Figure 2 below, the USA has its poorest performance in the IAEP test, which was based on the US curriculum.
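The standardization described above rescales pooled student scores to a mean of 500 and a standard deviation of 100 across participating countries. A minimal sketch of such a rescaling, assuming simple unweighted pooling (the actual PISA/TIMSS scaling is applied to IRT-based Warm ability estimates with student sampling weights, which are omitted here):

```python
import numpy as np

def standardize_scores(raw_scores, target_mean=500.0, target_sd=100.0):
    """Rescale pooled scores to a given mean and standard deviation.

    Simplified sketch: the operational scaling uses Warm (IRT) ability
    estimates and sampling weights, neither of which is modelled here.
    """
    raw_scores = np.asarray(raw_scores, dtype=float)
    z = (raw_scores - raw_scores.mean()) / raw_scores.std()
    return target_mean + target_sd * z

# Hypothetical pooled ability estimates for students across countries.
theta = np.random.default_rng(0).normal(loc=0.0, scale=1.0, size=10_000)
scaled = standardize_scores(theta)
print(round(scaled.mean()), round(scaled.std()))  # 500, 100 by construction
```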