The following is a guest post from Dr Joe Smith, University of Stirling, on the assessment of History using the new benchmarks.

Curriculum for Excellence is presently being equipped with ‘benchmarks’ to clarify what a child at each ‘level’ might be expected to know and do. In terms of history, this means that Education Scotland have addressed the messy question of progression in historical understanding. This blog post explores some of the problems with the proposals. (NB. Some of the arguments here are similar to those I raised in The Curriculum Journal, 2016.)

….

There exist several models for progression in history education, but all are based on the uncontroversial premise that ‘getting better’ means something other than ‘knowing more’. There is, after all, a literally infinite amount that one might know about the past, and so to say ‘I know more history than you’ is really to say ‘I know a tiny bit more about a tiny sliver of the past than you do’. There is no totality of historical knowledge against which our knowledge can be cross-checked, and this awkward fact means that we cannot assess children based on how much they know, because we, all of us, know very little.

Instead, progression in history refers not to a more complete understanding of the past, but to a more sophisticated one. For example, Shemilt (1983) produced one of the first workable models of how progression in history might be conceived. He argued that children moved through levels of understanding through which the complexity of the past was slowly realised:

  • Level One – There is a story of the past which can be learned. Things happened because they happened.
  • Level Two – There is a single simple story of the past which is easy to learn. Evidence which doesn’t fit this story is wrong. The past could never have been other than how it is.
  • Level Three – There is an appreciation that accounts of the past necessarily differ.
  • Level Four – There is a recognition that there is no single story about the past and the nature of the narrative depends on the questions one asks.

Shemilt’s is by no means a perfect model, but it demonstrates how measuring progression requires an assessment of how children are thinking about the past. If teachers must measure children, then they must assess the sophistication of the child’s thinking as revealed through their written and spoken responses. They cannot, and must not, simply put a tick or cross against what the child knows (or is perceived not to know).

The approach to progression seen in the new CfE benchmarks contains none of this sophistication.  The benchmarks are problematic in at least four ways.

  • They are, in many cases, so vague that they are devoid of meaning
  • They assess ‘knowledge’ that is pointless
  • They are startlingly undemanding
  • They ask children to behave in a way which is fundamentally unhistorical

In the following paragraphs I deal with each issue in turn, drawing upon examples from the benchmarks.

Vague and Meaningless

The second level benchmark says that a child,

‘Researches a historical event using both primary and secondary sources of evidence.’

This is nothing more or less than a description of the discipline of history. It is possible to achieve this ‘benchmark’ at every level from lower primary to a university dissertation. What, specifically, does a child have to do to say that they have met it? How independent do they have to be? What are they meant to produce at the end of this? How can you even assess the process of researching something? Historians research so that they can produce an account of the past – we assess the quality of an account informed by research, not the act of research itself.

Pointless Knowledge

Recognises the difference between primary and secondary sources of evidence.

The concept of ‘primary’ or ‘secondary’ is not inherent in a source: whether a piece of evidence can properly be called ‘primary’ or ‘secondary’ depends entirely on the questions that are being asked of it. A school textbook is a secondary source about the events it describes, but a primary source to a historian of education. In any case, it’s not even a useful distinction to be able to make. Being able to label ‘X’ as a primary source is of no practical use to children. In fact, it encourages formulaic thinking along the lines of ‘X is a good source because it is primary’, which is actively unhelpful to the child’s development of historical understanding.

Undemanding

At the second level, a child of eleven and a half…

Describes and discusses at least three similarities and differences between their own life and life in a past society.

I am pretty certain that my five-year-old child could meet this benchmark, yet this is the benchmark for a child on the verge of secondary school. Apart from anything else, this kind of ‘spot the difference’ activity is not particularly historical because it is devoid of explanation: ‘I put clothes in the washing machine, they use a mangle’ is not really historical thinking. Whereas,

‘Before the electrification of homes, people needed to do their washing on a mangle. This took a lot of time. Since electrification, we have washing machines, which means that we spend less time washing clothes’

contains elements of causation and change.

Unhistorical

Contributes two or more points to the discussion (in any form) as to why people and events from the past were important.

The qualities of importance or unimportance (or, more properly, significance) are not inherent in a particular historical topic, but imputed by the person talking about it. The phrasing of this benchmark presupposes that Event or Person X was ‘important’ and expects children to say why that is the case. The idea that we get to decide for children which actors in the past were and were not significant is deeply troubling. Instead, the expectation should be that children can disagree about why an event is significant (or even whether it was significant at all), rather than assuming it is important and asking them to tell us why. A better benchmark would be something like ‘Can choose a historical event and say why they think it should be remembered’.

So where has the problem come from?

The fundamental unsuitability of these benchmarks stems from the fact that they are based on the Experiences and Outcomes document, which was itself wholly unfit for purpose. I have written about its shortcomings at length (Smith, 2016), but the basic problem is that the Es and Os were never intended to be used as the basis of a progression model. They address two incompatible functions – prescribing content and defining procedural knowledge. When these two functions are translated into ‘benchmarks’, curious things start to happen. For example, the ‘E and O’ SOC 2-04a reads, ‘I can compare and contrast a society in the past with my own and contribute to a discussion of the similarities and differences’. This is a worthwhile activity for children to undertake – it asks children to appreciate change and continuity over historical time. However, when it is uncritically turned into a ‘benchmark’, a worthwhile activity loses its value: eleven-year-olds are asked to ‘Describe and discuss at least three similarities and differences between their own life and life in a past society.’

Another example is SOC 2-06a which reads, ‘I can discuss why people and events from a particular time in the past were important.’ In this phrasing, discussion is the thing that the child does – the child is writing or talking discursively. However, in the benchmarks the active verb ‘to discuss’ morphs into the passive noun ‘a discussion’ to which the child now contributes.  In the process, any semblance of historical thinking is lost.

So what is to be done?

The great pity here is that there already exists a document which might be used as the basis for more effective benchmarking – the 2015 Significant Aspects of Learning (SALs) document. Until very recently, advice from Education Scotland was for teachers to defer to the SALs, not to the Es and Os, in planning the learning of their classes. The 2015 SALs are predicated on the assumption that historical understanding is conceptual understanding; it is not a matter of knowing more. They include, for example:

  • understanding the place, history, heritage and culture of Scotland and appreciating local and national heritage within the world
  • developing an understanding of the world by learning about how people live today and in the past
  • becoming aware of change, cause and effect, sequence and chronology
  • locating, exploring and linking periods, people, events and features in time and place

By using the SALs, it is much easier to conceive of progression. For example, we have a well-established model for assessing progression in children’s understanding of causation, which derives ultimately from Shemilt:

  1. It was always going to happen, hence we cannot explain causation
  2. It was caused by one thing
  3. It was caused by many things
  4. It was caused by many things and we can categorise and prioritise these things
  5. It was caused by many factors which were interlinked and interdependent.

Everyone involved in education in Scotland wants its system to remain among the best in the world, but this means having a clear idea of what ‘the best’ looks like. Progression models are always simplifications of cognitive development, but they are underpinned by a disciplinary understanding of what ‘more sophisticated thinking’ looks like. If we reduce progression to a series of performative tasks, then teachers will inevitably teach to these tasks. Instead, we should be empowering teachers by demonstrating our aspirations for our children and trusting teachers’ professionalism to deliver on them.


7 thoughts on “Some commentary on History, progression and the Social Studies benchmarks”

  1. Hello everyone. I’m a bit late in commenting on this but I just returned to it after initially reading it in November. I too agree with a lot of the points mentioned here but I do have a couple of questions/critical comments.

    1. I don’t agree that there is no value in distinguishing between primary and secondary sources. The example given about a school textbook is surely stretching credulity – a History textbook is a secondary source in almost every situation bar the very niche example given above. For practical purposes, it is helpful for children to understand that some sources are produced by people at the time the events are happening and some are produced later with more of an overview of events. Both have their strengths and weaknesses and pupils should be encouraged to engage with them both to tease out what these are. The key is surely to encourage children to analyse all sources on their own merits (although this is very hard – they want a little formula to use!) but I feel having a distinction between primary and secondary is useful.

    2. I also have a real problem with Dr Smith’s comments about historical events. Even assuming the teacher makes no judgement at all and asks pupils to choose their own event and why it’s important, the children are going to rely on knowledge gained from their experience in life so far. This will lead them to pick events which are known to them because other people have decided they are important! Children need guidance in this area – there is nothing wrong with teaching children about people and events which are widely recognised throughout society to be important (e.g. the First World War, the French Revolution, the Scottish Wars of Independence).

    3. My understanding of SALs was that they were to be used to assess the curriculum, not plan or define it. The Es & Os were to be used to build the curriculum and then the SALs (or now the benchmarks) would be used to assess how well pupils were engaging with it. This may well be my misunderstanding – please let me know if it is!

    That is undoubtedly the longest comment I’ve ever left on a blog! Hopefully someone will read it!

    Thanks,

    Martin

    1. Hi Martin,

      I have read them and found your comments very stimulating. Here are my thoughts in reply:

      1. Perhaps I am guilty of pushing the ‘primary/secondary’ point too far. I suppose my objection here is to it being framed as an isolated benchmark, as though this is something that children can do, tick off and then move on. My point was that the primary/secondary distinction is dependent on what you’re using the sources for, and that it only makes sense in the context of a larger enquiry into the past. You are, of course, correct that there is a distinction between sources made at the time under investigation and those which use those sources to produce accounts. But I think to make this the focus of a benchmark exaggerates the importance of this distinction and of what it can tell us. If we are going to set up binaries for children to group evidence into, there are plenty of other more useful distinctions that can be made about evidence, e.g. is it a relic or a record? But, yes, perhaps it was OTT.

      2. Again, I think our views are closer than my original wording implies. I agree completely that if children are given no guidance then they fall back on everyday knowledge, which is often riddled with errors. However, I think the idea that certain events are more ‘important’ than others poses all sorts of questions about what criteria we use to define this. Your definition that ‘they are recognised throughout society as important’ falls into the trap of giving ‘society’ a unitary voice. I can make a case for why the three events (if indeed they are events) you have studied are important, but they would be different cases for each one (and different from yours, no doubt).

      I agree that schools are inevitably going to have to make decisions about which events to teach, but when events are chosen their importance/significance should be up for discussion. This is implied in the benchmark, but my concern was over wording – this is an end-of-P7 benchmark and children would presumably have studied several periods at school and would therefore have knowledge to draw on. If there is to be a discussion, it should be about the nature of significance as something which people have different views on, not ‘contributing two points’ to a discussion about event X in which the concept of significance is already a settled one.

      3. I think we’re in complete agreement here. The SALs and benchmarks were intended to check how well children were engaging with the curriculum. My argument is that the SALs form a better basis for doing this than the benchmarks do. It would have been possible to use the SALs to create benchmarks of achievement for children, but this is not what happened – instead, the Es and Os have been used. In many cases the Es and Os have simply been reworded to make the benchmarks. Es and Os cannot be used to measure progression (as they are a mix of progressive conceptual understanding, prescription of substantive knowledge and generic skills outcomes). My point is that if benchmarks were to be written, this should have been done by looking at the SALs, not at the Es and Os. By just rewording the ‘I cans’ of the Es and Os, we have a series of disconnected tick boxes with no central thread of progression in understanding running through it.

      1. I read Martin’s response and found it very interesting, as I did your original piece. Your follow up remarks to Martin have been helpful and I entirely agree with the point you make with regard to SALs; much more fit for purpose than reworded Es & Os.

      2. Hi Joe/Edith,

        Thanks for responding to my comments. It has been really helpful to read a little more about what you were arguing. I feel the point about what children are studying and the significance of it is a bit misunderstood – they will always be reliant on someone else’s interpretations of why the event is important. It is very difficult for children to have any remotely original insights into why an event was significant. I can’t imagine any of my pupils in S2 or S3 drawing their own meaningful conclusions without significant input from me (which gets us back to the same starting point of someone else deciding for them), let alone P7s. Even by Advanced Higher, many pupils would really struggle to draw their own conclusions without significant input. What I think we would be better doing is trying to demonstrate to them how to think about significant events (with decreasing support and more demand on them as we go) so that when they are at a much later stage of development (maybe university level) they are able to exercise the very difficult skill of reaching their own meaningful conclusions about why historical events are significant (or otherwise!). I do agree, however, that the end goal is a good thing and something we should be striving towards.

        I also take your point about the benchmarks – thanks for the explanation. I agree that the new benchmarks are much more like a tickbox exercise (from what I can see from the draft anyway) and this is not really in the original spirit of CfE. One of my favourite bits of writing was Mark Priestley outlining the friction between an open, teacher-led curriculum (which I think was the original intention of CfE) and the need for national standards, Es & Os, etc. The benchmarks only make this worse and mean that teacher agency is less the driving force behind the curriculum than ever. Ironically, the part of the curriculum that would be well served by being open and different for each school (the junior phase – primary and S1-3) is becoming more prescribed, while the part that needs to be clearly prescribed (S4-6) is still vague and full of ‘illustrative content’!

        Thanks again for responding,

        Martin
