Co-authored with Dr Marina Shapira, University of Stirling

This week has seen the triennial publication of the results of the 2018 PISA survey, including the much-awaited country rankings in reading, mathematics and science. In Scotland, these results have been keenly anticipated, following the ‘PISA shock’ of 2015.

The 2018 results have shown a modest improvement in the reading score – from 493 in 2015 to 504, effectively bringing the country’s performance back to the levels recorded in 2006, 2009 and 2012 (but below the earlier scores in 2000 and 2003). Scores in mathematics and science have remained essentially stagnant since 2015, with marginal declines in performance (from 491 to 489 in maths and from 497 to 490 in science).

Inevitably, these results have been used to score political points in the immediate run-up to the election. Deputy First Minister and Education Cabinet Secretary John Swinney tweeted:

PISA has its limitations but Scotland’s performance in reading has risen sharply. Just 5 countries are now significantly higher than Scotland. The Scottish Attainment Programme started with an emphasis on literacy – the foundation of so much other learning. That is bearing fruit. (3 December 2019)

The reaction in the media has been more negative. Holyrood Magazine stated that ‘Scotland’s score for reading improved in the latest PISA report, returning to a level similar to 2012 after a drop in 2015, but for maths and science there has been a decline in scores with each PISA survey since 2003 for maths and 2006 for science’. According to the Sun:

Pupils are now performing slightly worse than they did before she [Nicola Sturgeon] started improving our school system. And the figures are clear that, despite the up-tick in reading, performance in maths and science has continued to fall.

The Times has been similarly critical, stating that ‘performance levels in science and maths slipped to a record low in the Pisa test’.

So who is right? What do the PISA results tell us about Scotland? Is there really evidence of decline in standards, and can this be attributed to Curriculum for Excellence and the SNP, as many are claiming?

Prior to discussing the results it is important to note that the PISA study is based on a sample[1], and as such the measures it produces have sampling errors and cannot be automatically generalised to the entire population of 15-year-old students in Scotland. Therefore, before describing a change in a PISA score in 2018 (compared with previous years) as an ‘increase’ or a ‘decrease’, we should first check whether the change is statistically significant. ‘Statistical significance’ means that measures estimated from a sample can be generalised to the entire research population. We can never know a true population parameter unless the measurement is based on the entire population; we can, however, estimate the risk of making a mistake when we rely on the sample estimate. Usually 5% is deemed an acceptable level of risk, which means we can be 95% confident that the estimates hold for the population.
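To make this significance check concrete, here is a minimal sketch in Python. The reading means are the published Scottish figures, but the standard errors are assumed placeholder values rather than the official OECD estimates, and the OECD’s own trend comparisons also include a linking error term that is omitted here for simplicity.

```python
import math

# Minimal sketch: is a change in a mean PISA score statistically significant?
# The standard errors below are assumed placeholders, not official OECD values.
score_2015, se_2015 = 493, 3.0   # Scotland reading, 2015 (SE assumed)
score_2018, se_2018 = 504, 3.0   # Scotland reading, 2018 (SE assumed)

# 95% confidence interval around a single year's mean: mean +/- 1.96 * SE
ci_2018 = (score_2018 - 1.96 * se_2018, score_2018 + 1.96 * se_2018)

# Standard error of the difference between two independent estimates
diff = score_2018 - score_2015
se_diff = math.sqrt(se_2015 ** 2 + se_2018 ** 2)

# The change counts as significant at the 5% level only if it is more than
# 1.96 standard errors of the difference away from zero.
significant = abs(diff) > 1.96 * se_diff

print(f"2018 reading mean: 95% CI {ci_2018[0]:.1f} to {ci_2018[1]:.1f}")
print(f"Change since 2015: {diff} points (SE {se_diff:.1f}), significant: {significant}")
```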

Thus, the alleged ‘decrease’ in maths and science attainment in Scotland compared with 2015 is not statistically significant. In other words, the difference in numbers falls within the margin of error in this sort of survey, and the best that can be said is that there is no change between 2015 and 2018 in these subjects. Moreover, in international comparative terms, these performances fall pretty much in the middle ground of the league table, suggesting an average performance by Scotland. Reading scores have increased from 2015 to 2018, but the difference is small, and may also be a one-off fluctuation. To quote the recent UCL/IoE blog on the UK PISA results:

But hold your horses before getting too excited. One good set of results is NOT a trend! And a swing of this size in PISA can simply be a result of changes in methodology.

Thus, the media hype about Scotland’s decline in maths and science is not especially warranted, as the evidence is actually pretty underwhelming. Conversely, claims about boosts in reading are slightly more credible[2]. While the scores have largely remained stable in recent years, there are some interesting nuances in the data. First, the claim that only 5 countries (Canada, Estonia, Finland, Ireland and Korea) have a better score than Scotland in reading is technically true, given the methodological caveats explained above. And yet Scotland is in the company of another 12 countries whose reading scores fall within the same confidence interval (the interval where the true population parameter lies). Perhaps more remarkable is the fact that only three countries have reading scores for boys higher than Scotland’s (10 countries have reading scores for boys within the same confidence interval). However, Scotland is doing less well when reading proficiency levels are considered; quite a few countries with average reading scores lower than Scotland’s have a higher proportion of students achieving reading proficiency level 5 or 6 (e.g. England, Slovenia, the United States, Australia, Norway, Poland and Israel). Another interesting fact is that the socio-economic inequality gradient (the amount of variation in the reading test score explained by family socio-economic background) in Scotland is lower than the OECD average, but is similar to that in England.
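To illustrate what the gradient measure means, the sketch below regresses reading scores on a socio-economic index and reports the share of variance explained (R²). The data and the size of the effect are synthetic, invented purely for illustration; this is not an analysis of the PISA microdata.

```python
import numpy as np

# Synthetic illustration of the socio-economic gradient: the share of variance
# in reading scores explained by family socio-economic status (an ESCS-style index).
rng = np.random.default_rng(42)
escs = rng.normal(0, 1, 1_000)                       # socio-economic index
score = 500 + 25 * escs + rng.normal(0, 90, 1_000)   # invented SES effect + noise

# Simple linear regression of score on the socio-economic index
slope, intercept = np.polyfit(escs, score, 1)
predicted = intercept + slope * escs

# R^2 = proportion of score variance explained by socio-economic background
r_squared = 1 - np.var(score - predicted) / np.var(score)
print(f"Variance in reading score explained by SES: {r_squared:.1%}")
```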

Moreover, there are some interesting trends over time which merit further comment. The first concerns the reading scores. There was a sharp drop in the reading score during the 2000-2006 period, then stable results between 2006 and 2012, a drop in 2015, and a bounce back to the 2012 level in 2018. If we accept the 2015 result as a fluctuation (which may be a methodological issue), we can safely say that since 2006 there has been little change. Similarly in maths, we can again see a sharp drop between 2003 and 2006 (well before the introduction of CfE) and similar scores (with differences that are not statistically significant) since 2006. The big decline in scores took place before the introduction of Curriculum for Excellence (and indeed before the period of SNP government), and therefore it is inaccurate to suggest (as many media outlets seem to be doing) that the decline is due to the current curriculum reforms.

Science is the only area where the drop in attainment might be attributed to CfE; here we see stable results between 2006 and 2012, then a sharp drop between 2012 and 2015, and then a slight (but not statistically significant) decline in 2018. It is not clear why this might be the case. We can of course speculate; changes to the specification of content, increased formulaic teaching to the test, lack of accessibility to triple science for the senior phase, and a decline in practical work are possible suspects. However, these are empirical questions, and we simply lack the detailed knowledge of the nature and extent of these trends and their effects. Equally, the phenomena of declining scores in reading and maths between 2000 and 2006 need to be looked into, to increase our understanding of what might have affected attainment. As ever, these issues clearly flag the need for more research.

Another very interesting finding in the data relates to immigrant children. The educational attainment of immigrant children is often considered an indicator of the success of immigrant integration. Here the news for Scotland is very positive. In reading, second generation immigrant students in Scotland performed higher than or similar to all OECD countries, with only Singapore of the non-OECD countries having a higher performance than Scotland (521). Performance among first generation immigrant students in Scotland was also higher than or similar to all OECD countries (509). The OECD average for second generation immigrant students was 465 and for first generation immigrant students was 440.

In maths, second generation immigrant students in Scotland (512) performed higher than or similar to all OECD countries, with only Singapore and Macao (China) of the non-OECD countries having a higher performance than Scotland. Performance among first generation immigrant students in Scotland (500) was also higher than or similar to all OECD countries in maths. In science, in 2018, second generation immigrant students in Scotland (502) performed higher than or similar to all OECD countries, with only Singapore and Macao (China) of the non-OECD countries having a higher performance than Scotland. Performance among first generation immigrant students in Scotland (509) was also higher than or similar to all OECD countries in science. It is not unusual for immigrant children to perform better than children from a country’s majority population in STEM subjects. Yet, the fact that they are able to perform so well in Scotland might offer some insights into why native Scottish children are not doing equally well. One of the reasons could be a lack of interest and motivation, indicating an important area for policy development.

For the Scottish Government’s analysis of PISA results, see https://www.gov.scot/publications/programme-international-student-assessment-pisa-2018-highlights-scotlands-results/.

[1] In Scotland the study was carried out in 107 randomly selected publicly funded and independent schools, with about 40 students randomly selected from each school. Schools may then exclude certain students from the sample if they have additional support needs or language issues. This means that comparison of PISA results with the Government’s educational performance statistics should be made only with extreme caution, since the latter are produced for ALL students in PUBLICLY funded schools in Scotland.

[2] Although we note that evidence that this is directly linked to CfE or other interventions such as the Attainment Challenge is limited at best.

3 thoughts on “What do the PISA results tell us about Scottish education?”

  1. Thanks for this insight. Can you clarify whether results are weighted to allow for potential differences between public and private school pupils’ results?

  2. From Marina

    PISA provides weights both for students and for schools in order to correct for selection bias (since not every student and every school has an equal probability of being selected into the PISA sample).
    Once school weights are applied, the proportion of independent schools in the sample reflects the actual proportion of these schools in the total population (all secondary schools in Scotland). Similarly, applying individual-level weights corrects the sample bias in the socio-demographic characteristics of the students.
    Since the data include information about school type (state funded or fee paying), it is very simple to disaggregate the data and carry out the analysis separately for state funded and independent schools.
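As a rough editorial illustration of the weighting and disaggregation described above, the sketch below applies a final student weight and computes weighted mean reading scores by school type. The column name W_FSTUWT follows the PISA naming convention for the final student weight, but the tiny data frame is invented for illustration; a full analysis would also use PISA’s replicate weights to obtain correct standard errors.

```python
import pandas as pd

# Rough sketch: survey-weighted mean reading scores, disaggregated by school type.
# W_FSTUWT is the name PISA uses for the final student weight; the data frame
# below is invented for illustration, not real OECD microdata.
df = pd.DataFrame({
    "school_type":   ["state", "state", "state", "independent", "state", "independent"],
    "reading_score": [520, 480, 505, 535, 470, 540],
    "W_FSTUWT":      [110,  95, 120,  20,  90,  15],
})

# Weight each score, then divide summed weighted scores by summed weights
# within each school type (a weighted mean per group).
df["weighted_score"] = df["reading_score"] * df["W_FSTUWT"]
grouped = df.groupby("school_type")[["weighted_score", "W_FSTUWT"]].sum()
print(grouped["weighted_score"] / grouped["W_FSTUWT"])
```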
