Should we welcome Scotland’s National Improvement Framework?

The recent announcement that Scotland is to institute a system of national, standardised assessment in schools has aroused a wave of anger and anxiety amongst the teaching profession – and with good cause. The impact of high-stakes assessment – that is, high-stakes primarily for schools and teachers – has been well-documented. For example, there is a wealth of evidence illustrating its negative effects under the No Child Left Behind policy in the USA (e.g. see: Berliner 2011; Dee and Jacob 2011; Hursh 2005; Nichols and Berliner 2007). Similarly, scholars have been highly critical of the impact of assessment policy introduced by successive governments in England (e.g. Alexander 2011; West 2010). In Australia, and with worrying implications for Curriculum for Excellence, the introduction of the NAPLAN assessment framework has seriously undermined the innovative Queensland New Basics curriculum, which had inspired the original development of CfE (e.g. see: Lingard and McGregor 2013; Thompson and Harbaugh 2013).

The National Improvement Framework draft document itself is a further cause for concern. The document makes a case for generating evidence to improve schooling, but is actually rather inconsistent in its own treatment of evidence. For example, the foreword by the First Minister talks confidently about ‘the successful implementation’ of the curriculum and about how CfE has ‘transformed the quality of children’s learning, and their confidence and motivation’ (p.1). At best, these are unevidenced claims in the absence of a systematic evaluation of CfE; at worst they are inaccurate, given that the rather limited existing research on CfE suggests a more mixed picture of partial implementation and often minimal and strategic changes to practice (e.g. see: Priestley and Minty 2013; Drew, Priestley and Michael, in press). There is a need for more clarity in some of the key precepts of the Framework. For instance, is the purpose of the new standardised assessment formative, and in what ways? Or is it summative, and if so why? Or is the purpose of the data to evaluate the performance of schools? The Framework seems to suggest all three, and yet is not clear on how this will play out, or indeed about the various pitfalls in each.

It is in the use of assessment data for the evaluation of schools that the greatest pitfalls lie. The Framework lauds the freedom afforded to teachers by CfE, and yet its very existence might be said to threaten that freedom. Indeed, it is debatable whether CfE has brought greater freedom to schools, as I have suggested elsewhere (e.g. see: Priestley 2014; Priestley, Biesta and Robinson 2015). The basic issue lies in the distinction between different forms of curriculum regulation. CfE can be claimed to afford greater autonomy because it has relaxed input regulation (i.e. the specification of content and methods). However, this needs to be balanced against output regulation (public accountability through, for example, the use of external inspections and evaluative use of assessment data). My view is that output regulation has eroded school autonomy in many countries more effectively than did any tightly prescribed curriculum. This is because such regulation leads schools to perform – to fabricate an image to meet inspection demands, and to teach to the test. A consequence of this, as has been amply demonstrated in the research cited above, has been a narrowing of the curriculum and an erosion of what might be called educational decision-making, as schools develop highly performative cultures (in fact this trend is evident across a range of public services from hospitals to railways). Scotland already has high levels of output regulation, which in my view have impeded the full development of CfE. The new standardised assessment system may exacerbate this.

The above criticisms might suggest that I am wholeheartedly opposed to the new moves to generate system data to improve the educational outcomes of young people, especially those traditionally disadvantaged. In fact, I am not. First, I recognise the importance of addressing under-achievement and thus support the aims of the Framework. Second, education systems need rigorous data, and arguably Scotland’s attempts to improve its schools have been hamstrung by a lack of such data, including robust research evidence on the enactment of CfE. Third, the introduction of a national system simply recognises a de facto situation, given that 30 out of 32 Scottish local authorities are already using standardised assessment systems to track progress. It could be argued that by bringing this in-house, the government is introducing greater quality control and standardisation, with the potential to prevent poor and inappropriate use of data by local authorities.

However, the above ‘opportunities’ are subject to some major caveats. First, there needs to be more clarity about what sort of data is sought and how the data will be used – and importantly there needs to be clarity about how it will not be used. It is disingenuous to claim that Scotland does not have league tables, when local authorities use such data – and have done so for years – to compare schools. Second, the government needs to be more rigorous and less selective in how it uses research data. It would be good to see government documents explicitly citing sources, rather than making general statements such as ‘there is a strong body of evidence about the key factors of a successful education system’ (as it does in the Framework, p.3). It would be good to see a more active engagement by the government and its agencies with education scholars who have expertise in this area. And finally, it would be good to see a publicly funded research programme to generate data about the impact of this initiative as it unfolds (with the guarantee that policy will respond to such findings, rather than ignoring them if inconvenient).

NB. The official Stirling School of Education response to the Framework can be found at http://www.stir.ac.uk/education/school-responses/response-to-the-draft-national-improvement-framework-for-scottish-education/.

References
Alexander, R. (2011) Evidence, rhetoric and collateral damage: the problematic pursuit of ‘world class’ standards. Cambridge Journal of Education, 41 (3), pp. 265-286.
Berliner, D. (2011) Rational responses to high stakes testing: The case of curriculum narrowing and the harm that follows. Cambridge Journal of Education, 41 (3), pp. 287-302.
Dee, T.S. and Jacob, B. (2011) The impact of No Child Left Behind on student achievement. Journal of Policy Analysis and Management, 30 (3), pp. 418-446.
Drew, V., Priestley, M. and Michael, M. (in press) Curriculum development through critical collaborative professional enquiry. Journal of Professional Capital and Culture.
Hursh, D. (2005) The growth of high‐stakes testing in the USA: accountability, markets and the decline in educational equality. British Educational Research Journal, 31 (5), pp. 605-622.
Lingard, B. and McGregor, G. (2013) High-stakes assessment and new curricula: a Queensland case of competing tensions in curriculum development. In: M. Priestley and G. Biesta (eds.) Reinventing the curriculum: new trends in curriculum policy and practice. London: Bloomsbury Academic, pp. 207-228.
Nichols, S.L. and Berliner, D.C. (2007) Collateral damage: How high-stakes testing corrupts America’s schools. Cambridge, MA: Harvard Education Press.
Priestley, M. and Minty, S. (2013) Curriculum for Excellence: ‘A brilliant idea, but…’. Scottish Educational Review, 45 (1), pp. 39-52.
Priestley, M. (2014) Curriculum regulation in Scotland: A wolf in sheep’s clothing is still a wolf. European Journal of Curriculum Studies, 1 (1), pp. 61-68.
Priestley, M., Biesta, G.J.J. and Robinson, S. (2015) Teacher Agency: An Ecological Approach. London: Bloomsbury Academic.
Thompson, G. and Harbaugh, A.G. (2013) A preliminary analysis of teacher perceptions of the effects of NAPLAN on pedagogy and curriculum. The Australian Educational Researcher, 40 (3), pp. 299-314.
West, A. (2010) High stakes testing, accountability, incentives and consequences in English schools. Policy & Politics, 38 (1), pp. 23-39.


Filed under Curriculum Development

5 responses to “Should we welcome Scotland’s National Improvement Framework?”

  1. Biddick, Craig

    Hi Mark

As an NZ-born head teacher in Scotland, I have been voicing such concerns, albeit without the same degree of scholarly rigour. To me, CfE is a hypothesis waiting to be proven worthy of the rhetoric. As for the NIF, it is far too rushed to be a sensible strategy to raise attainment. Thanks for the post.

    Regards
    Craig


  2. Roddy Renfrew

    Mark

    Thank you for a very well informed and insightful post and also for highlighting the related Stirling School of Education response to the government’s call for evidence. It is well worth reading.

    For me, CfE is not the main factor here. Rather it is, as you point out the use of ‘high stakes assessment’. This has been accelerating particularly since the Standards in Scotland’s Schools etc. Act 2000 which required local authorities to ‘secure improvement in the quality of school education.’ Often this was taken to mean mostly an increase in attainment. The government faced a problem when the data in recent years showed a decline in some key assessment areas e.g.from the Scottish Survey of Literacy 2014. The solution it seems is to change the assessment system. This will no doubt lead to a rise in attainment over time. As Wynne Harlen has pointed out, if you leave an assessment system unchanged for a few years there will almost always be an improvement in attainment as teachers and learners come to terms with the requirements of the new system.
    It would have been more useful to commission research into the causes of the recent attainment decline and use this to formulate strategies that would lead to a different kind of improved attainment.

    Regards

    Roddy

  3. Pingback: Should we welcome Scotland’s National Improvement Framework (part two)? | Mark Priestley

  4. Pingback: Assessment can be dangerous….. | Cat's eyes

Leave a Reply

Fill in your details below or click an icon to log in:

WordPress.com Logo

You are commenting using your WordPress.com account. Log Out / Change )

Twitter picture

You are commenting using your Twitter account. Log Out / Change )

Facebook photo

You are commenting using your Facebook account. Log Out / Change )

Google+ photo

You are commenting using your Google+ account. Log Out / Change )

Connecting to %s