Category Archives: curriculum

Project Halpin: ‘Cultural Literacy’ (2) – Hirsch, Knowledge and the Learner

This is part of a series of posts that I’ve been writing over a much longer period than I originally planned.  The idea came from a lecture given by David Halpin, in which he discussed the need for us to approach and listen to ideas that seem antithetical to our own.  On the back of that experience, and of a growing awareness of the ‘attitudinal bubble’ that I live in, I’ve been reading books which, on the face of it, I might not agree with.

The latest book is E.D. Hirsch’s ‘Cultural Literacy’. In my last post I discussed some of the things I liked about the book, and my concern with the foundations of Hirsch’s theory – that faster reading is a cure-all for educational ills and problems of learning.  In this post I start to discuss the most obvious problem, Hirsch’s somewhat limited conception of what it means to learn, and his curiously passive learner, which I’ll follow up in my next post.

For Hirsch, the underlying assumption is that learning is one-way. Rather than transaction we have transmission. Rather than active construction we have accretion. Knowledge must come first, communication second. This also renders the world a passive place, to which we can only apply pre-learned knowledge, itself received as a transmission from a more knowledgeable other.

Hirsch’s model of learning is one of assimilation – making things fit in with what we already know.  This is directly related to his focus on reading fluency: ‘slowness of reading beyond a certain point makes assimilation of complex meaning impossible’ (p.57).  So forming complex meaning is only possible through assimilation, and only when this assimilation is achieved through fluent reading. This, in turn, can only occur when reading things about which the reader is already knowledgeable, or which are culturally familiar to them.  This leads logically to the conclusion that facts and knowledge must be learned before they can be applied. Assimilation of material here is the overlaying confirmation of what is already learned, what is pre-assimilated.

For Hirsch this assimilation is natural – and one-dimensional, relying as it does on memorisation and rote-learning. Hirsch objects to the ‘pious rather than realistic’ rejection of this method of learning that he detects in contemporary educationalists. Without supporting evidence beyond the anecdotal, Hirsch asserts that children at an early age have an ‘almost instinctive urge to learn specific tribal tradition […] and are eager to master the materials that authenticate their membership in adult society’ (p.30). He points to the eagerness with which children hoover up the rules of their favourite sport as an example of how ‘memorisation’ should be re-examined as a way of helping children to learn.

This is probably not the time or place to rehearse arguments about rote-learning versus other strategies and ‘which works best’.  Suffice to say that, for me, using ‘memorisation’ as a way of learning things seems to be something of a tautology – “we memorise to remember things”.  It also sets up something of a dichotomy which, I would argue, doesn’t really exist in most classrooms.  It would be hard to find even the most traditional teacher ‘just telling them’, or relying only on memorisation of facts.  Similarly, it would be hard to find a ‘progressive’ teacher who deals only in subjective opinion, or who facilitates learning only through discovery.  In practice most teachers will tell some things, encourage memorisation of some things, facilitate exploration of some issues, and use discovery and suspense in relation to others.

Willingham suggests that the kind of knowledge that we might pick up only through rote-learning is going to be shallow.  Hirsch can just about make the claim that rote-learning could be a useful method of assimilating new knowledge only because he has set a low bar on the depth of that knowledge.  Hirsch directs us to teach a broad and shallow set of knowledge.  I would argue that by not encouraging students to see the deep structures of problems, shying away from analogies and exploration, and by failing to point out that there are problems with some (not all!) items or types of knowledge, we risk restricting the extent to which knowledge can be transferred to new problems.

We might also be missing a trick or two as educators.  There will be key pieces of information that we all want students to know, and to be able to recall.  Repetitive rote learning will play a part in learning these facts.  However, we’ve got to make sure that these are the right kind of facts to be learned in this way, that these facts are correct, and that we give students a chance to see the deeper relations and structures between them.

As an example of the way that the ‘knowledge turn’ in pedagogy risks cutting off learning opportunities from our students, let’s take a look at the idea of ‘knowledge organisers’ in history.  I’ve used these – though I called them ‘glossaries’, and often I took the idea of ‘advance organisers’ as a model (for more on this, see Ausubel’s ideas).  Their most recent popularity stems from a post by Joe Kirby on his brilliantly helpful website ‘Pragmatic Reform’.

There are some really great things about this – the careful thinking that has gone into deciding which items of knowledge the students will need, the focus on common spelling errors, the key quotes adding compelling context and colour to students’ knowledge.  This is Hirsch in practice, in that brief, shallow descriptions are given in the manner of vocabulary definitions.  However, in dealing with some more complicated concepts in a shallow way, important misconceptions are introduced.

For instance, if we take a closer look at the section on ‘political vocabulary’ we can see that ‘government’ is defined as ‘the political party with the most MPs in parliament’, and ‘political party’ as ‘a group organising to win an election’.  These are obviously problematic when studying Apartheid – the government was much more than the National party, and the ANC, whilst definitely being a political party, as a banned political party was unable to organise to win an election.

Multi-faceted historical contexts are also flattened. Democracy is not defined here; it seems to be taken for granted that students will understand perhaps a Platonic model of democracy.  The very point about Apartheid-era South Africa is that there were two competing models of democracy. Supporters of the regime would have denied that their country was undemocratic.  Often we can give helpful definitions, but that is not always true, and often we need to do much more. I’m not suggesting for a minute that Joe Kirby doesn’t do just that, but I worry that the impression given is that all the knowledge needed can be fitted on one side of A4 and delivered to passively waiting students.

Of course, many people use KOs and other similar techniques in lots of ways, but these all seem to be much more than ‘memorisation’, and in ways that build much deeper knowledge structures.  Toby French has a great post which puts his use of KOs in the context of his wider practice, for instance.  I would go further, however.  Students need time to assimilate, accommodate, test and refine their understanding of concepts about a historical context.  We might start by exploring what democracy, or a political party, means to them today, before comparing that with the views and experiences of those living in South Africa during Apartheid.  We might then ask them to fill in a few knowledge organisers (or to explain some concepts) so that we can assess the development of their understanding, and refine our own planning for future learning.

In other words, teachers also need to listen to their students’ ideas, not only to correct their students, but so that they as teachers can refine and perhaps improve their own understanding.  In fact this final point is the thing that people have been trying to teach me for years – from my PGCE tutor, Anna Pendry, who gently suggested that I read some more history books, to the parent who rightly complained when I mistakenly told his son that Sefton Delmer wrote for the Daily Mail. The wonderful students who have asked me hard questions that forced me back to my books, and the wonderful teachers who have made me look again at source materials or asked me to think about how I approach what I thought was a familiar topic, have over the years helped me to see that I am also learning.  Unlike Hirsch, I recognise that as teachers we also need to approach ‘truth’ afresh, and sometimes through the eyes of others.  Attempting the delivery of a chunked-up world does not help me, or my students, to do that.


Whitechapel 1870-1900

The other day I was asked on Twitter about the books that I read whilst writing the Whitechapel 1870-1900 section of this book.


I’ve got five minutes, so I thought I’d quickly write about one or two of them.  Of these, Crime in England by Godfrey and London’s Shadows by Drew D. Gray were the most useful from a writing point of view.  Gray’s, in particular, is a detailed and interesting introduction to the social context and a real pleasure to read.  Along with City of Dreadful Delight it made me think carefully about how I was going to write the book, especially when I was writing about the lives of women generally and those of the Ripper’s victims in particular.

‘Victorian Convicts’ by Godfrey, Johnston and Fox will be very useful as the course goes on and as you start to teach it.  I heartily recommend it (if you’re going to buy a copy of this or any of the other books, buy it from Mr B’s Emporium, a proper bookshop that’ll deliver just as quickly as Amazon and which pays its taxes and does everything else that we like proper bookshops to do).

Online / Electronic Resources

I also used a few books which I read online. Neil R Bell’s ‘Capturing Jack the Ripper’ was really helpful, not only about the particular circumstances of the Ripper case, but also about police procedure, recruitment and life ‘on the beat’.

Some of the geographic bits of ‘the Historic Environment’ were very interestingly addressed by

The Whitechapel Society’s site not only contains great articles but also hosts a really great podcast (to which Neil R Bell is a regular contributor).

Of course I spent a great deal of time at a very detailed and comprehensive site, with some very wise contributors.

Because the records of police courts have not usually survived in detail, the best record of the crimes that appeared at the Thames Police Court at this time is the reports in newspapers.  I used one archive a great deal – as well as another, amazingly free, site which gave me the moving story of Sarah Fisher’s baby that starts my section of the book.

Lovely, friendly and helpful people.

Whilst trying to find out more about this incident, I found that I really needed to see the Attestation Ledgers and Divisional Registers for H Division (Whitechapel’s division).  They’re in London at the Metropolitan Police Heritage Centre and they have not (yet) been digitised. So, I rang them.  I expected that they’d say ‘you’ll have to come down’, but they didn’t. They looked it up for me, and helped me to understand the information on the ledgers. They sent me a picture of the Divisional Ledger, and even put me in touch with the Friends of the Met Police Historical Collection, who used their forums to help me work out where this event had taken place.  You’ll have to buy a copy of the book to find out what I discovered though :).  I couldn’t have written my part of the book without their kindness and their efforts, and I’m really grateful.

What is a Textbook? IV: What are Textbooks for – in Classrooms?


In my last post on this topic I explored the ‘conditioning’ and ‘coherence’ effects that Tim Oates claims at state level in his policy paper ‘Why Textbooks Count’. In that post I set out my concerns about the way the paper deals with the idea of ‘coherence’, and how this causes Tim Oates to overstate the effect of state or central control over textbooks’ content. This is a theme that I will return to in this blog-post. I’ll also be developing my critique of Oates’ treatment of the idea of ‘coherence’.

I want to focus on the effects that Tim Oates claims for textbooks at the classroom level in key jurisdictions.  Along the way I’ll suggest more ways in which the paper suffers from evidential problems, and point out the rhetorical devices which, in my opinion, are the sine qua non of a policy paper, rather than of a piece of ‘New research’ as is claimed by the title of the page on which it is posted.

According to Oates, at a classroom level well-designed textbooks ‘free up teachers to concentrate on refining pedagogy and developing engaging effective learning’ (p.4). In ‘key nations’ they have also ‘been developed to support highly effective pedagogic practices’, where they also ‘encourage clarity regarding key concepts and core knowledge, [and] provide clear learning progressions’. This support enables teachers to provide ‘enhanced responsiveness to individual learner need’. None of this is controversial. I recognise these advantages from my own practice – and it would not surprise me if ‘high performing teachers [were] most supportive of the use of well-designed textbooks’ as Oates claims Reynolds and Farrell’s 1996 study suggests (though I must admit I can’t find this claim in that study). I am also reminded of my own search as teacher and HOD for textbooks that were coherent with the curriculum aims that I had for my students.

An ambiguous comparison

One of the key rhetorical devices that Oates uses is a sense of crisis, of a problem with a daunting scale in the quality and use of textbooks in England.  In order to support the idea that the use of textbooks is ‘dauntingly’ low and problematic, a confusing picture is then painted in which English teachers’ responses to the TIMSS 2011 questionnaire items about the use of textbooks are compared with those of Finland and Singapore.  Oates sets out the very low numbers of English teachers reporting that they use textbooks as ‘a basis for instruction’, and contrasts these with the much higher proportion of teachers from Finland and Singapore who report that they use textbooks as ‘a basis of instruction’.

However, the questionnaire itself asked whether teachers use textbooks as ‘basis of instruction’, or as ‘supplement’, not as ‘the basis for instruction’ nor yet as ‘a basis of instruction’ as Oates sets out in his table headings. I’m aware that this might seem like taking close reading to a pedantic level, but as I hope we’ll see – it’s important.

It is suggested that this comparison will ‘confound many common assumptions about these three countries’, though only two such assumptions are hinted at, which arise out of ‘high level messages regarding high school autonomy and learner-centred pupil support’ in Finland. I’m forced to presume that Oates thinks that from these high level messages many people assume that the Finns don’t use textbooks. This is not something that I’m aware of. If people did think that then they’re obviously wrong. Finnish teachers obviously value textbooks, as ‘basis of instruction’ and as ‘supplement’ (or whatever they were asked in Finnish).

In which case what else are we supposed to do with this comparison? Well we’re told that the problem (the one with the daunting scale) is ‘low use and low quality’ of textbooks in England (p.4), and here we’re asked to compare the use of textbooks in England with that of Finland and Singapore. Both are bywords for high performance in international comparative tests, whereas English performance in such international comparisons is commonly assumed to be dire. Is Oates comparing dire England’s ‘low’ use of textbooks with the high use of textbooks in high performing Finland and Singapore?

Why doesn’t Oates join these dots more directly? Possibly because there is no correlation between the proportion of teachers reporting that they use textbooks as ‘basis for instruction’ or as ‘supplement’ and a country’s TIMSS 2011 scores at any of the benchmarks that TIMSS used. In fact we might be able to confound a few assumptions by pointing out that in maths, far from being dire in comparison to Finland’s performance, England comes out at similar levels, despite the ‘problem’ of low use of textbooks. Performance in Singapore was stratospherically better than in both England and Finland, yet in Singapore teachers report that they use textbooks as ‘basis for instruction’ less than they do in Finland.

A flawed comparison

What we can say from the TIMSS 2011 questionnaire is that in England 74% of 4th grade teachers of maths report that they use textbooks (either as ‘basis’ or as ‘supplement’) in their lessons, whilst 88% report that they use workbooks or worksheets, either as ‘basis for instruction’ or as ‘supplement’.

The omission of an ‘a’ or a ‘the’ in the question about ‘basis’ or ‘supplement’ is important and makes it harder for us to understand the responses of the TIMSS teachers. When I taught classes I used a textbook in the vast majority of my lessons. I would often use it as ‘a basis of instruction’ but not ‘the basis of instruction’. Sometimes I would also use a second textbook as a supplement if I felt that material was better covered therein. If I was asked whether textbooks were ‘the’ basis of instruction in my lessons I would say “no”. My own knowledge was “the basis of instruction”, though often a textbook would be used as ‘a basis of instruction’, with my knowledge informing how this instruction was done.

So, all in all, we can say that English teachers are less likely to report that textbooks are ‘basis of instruction’ in their lessons than teachers in all other countries, although 78% report that they use textbooks in lessons. This is less than in other countries, but there may be special reasons for this: when English teachers teach maths and science they often use bought-in worksheets.

So, what was the point of this comparison, if we can’t join the dots and say that using textbooks as ‘basis for instruction’ is more likely to lead to better outcomes? To understand the use that Oates puts the comparison to we’ll have to pick over this sentence:

With levels of use lower than other jurisdictions, and very low levels assigned to ‘as a basis for instruction’ [sic] what is interesting in England is the existence of an underlying ‘anti-textbook ethos’, and its location in teacher training and educational research communities,

It’s not clear whether Oates is suggesting that the TIMSS survey is evidence of an anti-textbook ethos. The two things are put side by side and, again, it seems it is up to us to join the dots.

However, the ‘existence’ of this ethos is assumed. As we have already seen, we don’t know if in the TIMSS survey there were low levels assigned to ‘as a basis for instruction’, because that’s not what was asked. As we will see, its location in teacher education is only perilously supported by anecdote.

Oates cites Marland as evidence. Marland’s analysis is described as ‘comprehensive and penetrating’ (p.8). Oates seems to be suggesting that a survey in which 78% of science teachers responded that they used textbooks in lessons supports Marland’s findings of an anti-textbook ethos. If we read the extract from Marland, this ‘comprehensive’ analysis suggests that there is little evidence in the literature for the proposed anti-textbook ethos other than anecdote. My own anecdotal experience as a teacher trainer at university and as a mentor in school would suggest that trainee teachers might be discouraged from using textbooks as ‘the basis for instruction’, and positively pushed into using them as ‘a basis for instruction’.

Context and Historical Analysis

The paper is very critical of those wishing to explain Finland’s success in simple terms of contemporary school and professional autonomy without exploring the historical context of the country’s success, or of its current relative autonomy. This would imply that any analysis of the seemingly low use of textbooks in England in maths and the sciences which did not consider the historical context of this pattern of use would also be flawed. This paper makes no attempt to consider such a history, despite finding space for an (also flawed) consideration of Finland’s recent educational history. The TIMSS survey in 2011 came after the end of the New Labour years of national strategies, specified schemes of work and centralised resources distributed in packs of booklets or downloaded from the DCSF website. In these circumstances it is not surprising that teachers in science and maths report that they are less likely to use textbooks as ‘basis for their instruction’.

Furthermore we need to consider Oates’ own definition of ‘textbook’ in his paper, which he gives us as:

rigorously-designed paper-based materials which can include textbooks for teachers’ use, textbooks for pupils and pupil workbooks.

This is a wide definition, and elements of it could easily fall within the ‘worksheets or workbooks’ as well as within the ‘textbooks’ that English teachers were asked about in the TIMSS questionnaire. So, workbooks mean textbooks to Oates, but the TIMSS treats them separately. We know that in many English primary and secondary schools science materials are often purchased in the form of photocopy master worksheets. These too could be defined as ‘rigorously designed paper-based materials’. So, another reason that so many teachers report low use of textbooks could be because their school doesn’t buy in its ‘rigorous paper based materials’ in the form of books, but in the form of worksheets.

Fundamental flaws in Oates’ claims

However, there are real problems with Oates’ claims about (1) the quality of English textbooks, and (2) the potential for centralised control over the content of textbooks to combat these perceived shortcomings. It may be that England’s textbooks are of an inferior quality, though there is not enough evidence in this paper to support such a claim, and such evidence as there is suffers from important methodological problems.

In our discussion, Oates presented us with a ‘method statement’. Interestingly, this statement is not in the paper itself – the ‘New research’ that was published on the site.

In the paper itself we are told that 200 textbooks were collected and used ‘as part of the transnational curriculum content mappings’, which then allowed for ‘further analysis of the qualities of the textbooks themselves’. We are also told that the textbooks were ‘documented for the different kinds of information elements which [they] contained and the manner in which [they] presented these elements’. The assessment of these textbooks was based on ‘the coherence of the text based on either correspondence to a stated model (eg spiral curriculum) or to an obvious form adopted in the text’.


The treatment of the issue of sampling is a key problem in this paper. In the ‘method statement’ on the website above we are told that:

“Just over 200 books were examined, obtained from Singapore, Hong Kong, Finland, Alberta and Massachusetts. […] Books in Science, Maths, History, Geography, and English/Native Language and Literature were collected and scrutinised, covering both Primary and Secondary. Books on early reading were included and were of course focussed on early Primary.[…]”

In the paper we are asked to compare 3 textbooks – one from each jurisdiction. The problem with this is obvious. Each textbook is made to stand for the textbooks of its jurisdiction. If we were drawing conclusions only about those particular textbooks then these conclusions might have weight, but instead each one is held out as representing all the books in its jurisdiction. The choice of a KS4 book in England is an interesting one, and brings us back to the definition of ‘textbook’. KS4 textbooks vary widely in audience and in purpose, which leads them to vary widely in content and tone. Is this a class text? Is it a revision textbook? Is it one that has been approved by the examination board (and which therefore has to contain information and examples which relate directly to the examination)?


The method for the study is also not outlined in detail, and the method itself seems to have been forgotten when the data was collected and analysed. There are two steps in the method: ‘documentation’ of the different kinds of information elements, followed by ‘assessment’ of the coherence with a ‘stated model’ or an ‘obvious form adopted in the text’.

Turning first to the idea of ‘documentation’, this is an interesting choice of word. I would take it to imply the production of a non-judgemental record of the elements of the books, which would allow further analysis and supported judgement. What we are not told is the framework by which the documentation took place. Beyond the aim of documenting ‘key elements’, we don’t know whether the civil servants who did this analysis were looking for visual elements, explanatory elements, or information elements.

When we look at the case study texts ‘extracted from the textbook analysis’ we can assess how closely the method stuck to the principle of documentation, and an idea of the unwritten analysis framework starts to emerge. Documentation in fact seems to involve a fair amount of evaluation, which would not be problematic in itself if we were told the framework around which the evaluation of the different types of ‘element’ was taking place.

So, the ‘Elements’ list for the maths textbook from Hong Kong starts with what seems like documentation. We are given a list of elements as follows:

  • Statement of Pre-requisites
  • Review activity to determine whether pupil is ready for the chapter
  • Different forms of the equations of circles
  • Features of circles from the equations
  • Equations of circles from the different given conditions
  • Intersection of a straight line and a circle
  • Inclusion of a series of problems
  • Check through assessment: 6 problems, 1 practice exam Q, 1 lively maths problem

What we have is a list of things that are in the textbook. We could quibble about the ‘lively maths problem’ and ask about what makes it ‘lively’. A similar list is given for the textbook from Singapore. By the time we get to the ‘documentation’ of the English textbooks we are told that the IGCSE textbook has:

  • Clear statements of mathematical ideas
  • Clear statements of operations
  • Some sample activities

The GCSE English textbook ‘documentation’ is cursory to say the least, and seems to have been done with a pre-conceived judgement in mind:

  • Extremely diverse content within diverse structure – complex
  • Divided into Higher Tier and Lower Tier elements to match the examination
  • 299 pages long
  • Sample full GCSE exam paper very early in the text: p11

Are the ‘sample’ activities set out in the IGCSE textbook divorced from the type of questions that will be asked in the final exam? I don’t know, and we are not told. What makes the IGCSE textbook’s statements of ideas and operations ‘clear’? I don’t know, and we are not told. How do they compare with similar statements in the GCSE textbooks? We don’t know, because we are not told. What does it mean that the English GCSE textbook is ‘diverse’ in content and structure? How is ‘complexity’ judged? We go from a list of elements with some implicit approval of the Asian books, to a clear but unsupported critique of English GCSE textbooks which abandons any attempt to ‘document’.

This tendency to eschew documentation and skip straight to judgement is best illustrated by the recording of the ‘Key Features’. These consist of value judgements, rather than documentation or even an analysis of ‘coherence’. So, the Hong Kong book contains ‘important’ evaluation and ‘good’ elaboration, the Singaporean book ‘extremely clear’ statements and ‘good’ elaboration, whereas the English textbook is ‘rather incoherent’.

Furthermore, the second stage – the analysis of coherence with a ‘stated model’ or an ‘obvious form adopted in the text’ – is not further explained, though we are assured that the coherence of the Singaporean and Hong Kong texts was ‘impressive’. In the paragraph which seems to draw conclusions we are instead told that texts from Hong Kong and Singapore had ‘extremely clear presentation, explanation and reinforcement of key concepts and ideas’. We are not told which stated model they cohere with, nor which obvious forms adopted in the text they cohere around. We are not told what the differences or similarities are between a ‘model’ and an ‘obvious form’, whether this is a model of presentation, learning or cognition, or whether we are looking at obvious forms of ‘style’ or otherwise. This lack of definition makes it impossible for anyone to repeat the analysis by looking at the same sample of 200 books (should that list be made available).


All of this brings us back to the central issue of coherence. Even in those jurisdictions which Oates chooses to focus on, where there is (or was) state control over the content of textbooks, there is evidence that the force of ‘coherence’ is (or was) provided by a wider agency than a central curriculum authority and that a good deal of it came ‘upwards’ from the classroom and from teachers, as well as from society as a whole. In Hong Kong we are told that textbooks tend over time to become similar as they follow the market leader, and thus are affected by teachers’ views as well as by the self-censorship that publishers undertake to please the actual censor. Not only that, but attempted innovations fail because they are rejected by the teachers – interestingly it is the teachers who are providing coherence, as well as the government approval system. In Singapore we also read about the ‘panel of professionals’ drawn from ‘curriculum specialists, teachers and academics from universities’ who approve textbooks. In my last post on this topic I discussed the societal shifts and coalescence which resulted in the education reforms of Finland in the 1960s and which provided wider coherence to the implementation of these reforms.

As we also saw in the last post in this series, Schmidt’s paper on coherence suggests that teachers will place authority and trust in curricular materials which cohere with the curriculum as it is enacted. If we accept Oates’ argument about the narrowness and instrumentalist approach of English textbook publishers (though I don’t think that we have to, on the basis of the poor evidence that Oates supplies), and his evidence that teachers’ views and practices have caused the failure of state-sponsored changes to textbooks in other places, then we must look at the curricular system as a whole in order to find out why textbooks are like this in the English system, not apply state-level approval of texts as the primary guarantee of ‘coherence’.

This all boils down to my basic problem with this paper – that it does not reflect the critical-realist methodology that Oates has claimed for it. Of course textbooks count – but the important questions are the extent and the mechanisms by which they count in particular circumstances. My worry about the analysis in this influential policy paper is that these circumstances are only referred to in just enough detail to provide legitimacy for the threat of state intervention.

5 great podcasts for history teachers*

My job means that I’m quite often in my car, and therefore listening to my radio.  Unfortunately, this often seems to coincide with ‘Moneybox Live’ or Chris Evans.  In response to this terrible conjunction, I’ve fallen back in love with podcasts.  My subscription list is all the best bits of Radio 4, with added shows that Radio 4 should commission, and without ‘Quote-Unquote’ or ‘The Unbelievable Truth’.  Recently I’ve heard some fantastic episodes which I think could be used in the history classroom – either as inspiration for lessons, as CPD for those wanting to improve their knowledge of a topic, or as something that could (with cuts and tweaks) be used directly with pupils.  I thought I’d share these with you.

In Our Time

Of course I'm going to start with In Our Time.  Consistently brilliant and always challenging (especially when it's about quantum physics), In Our Time occasionally serves up an episode which you immediately want to turn into a scheme of work.  I could point to the 'Lancashire Cotton Famine' as an example which could help us teach the history of industry and the end of slavery together – with a really global reach.  I might urge you instead to listen to the staples of 'The Armada' or 'Suffragism' if you want to learn more than the basics about these important events.  When I start some new teaching or writing on a history topic, often the first search I make is of the In Our Time archives.

History Extra

Until recently I didn't listen to History Extra: I didn't like the early podcasts, which felt to me like a marketing exercise, and there seemed to be a lot of military history.  Recently, however, I listened by chance to an episode about the dissolution of the monasteries, and a piece on Surinam, which was a really interesting explanation of the links between state, trade and colonialism in Stuart, Civil War and Restoration Britain.  This excellent episode earned the show a place in my podcast schedule.  The next episode, on Charles II, was even better.  Listening to Clare Jackson's fascinating and nuanced views on the character of Charles II (or, even better, buying her book) would be a great first step for teachers of the Restoration British depth study from AQA's 2016 GCSE, and I urge you to give it a go.

History Pod

This great, well-researched podcast is produced by Scott Allsop, a proper history teacher.  Scott's 'on this day' podcast often reaches parts of history that the others cannot.  A recent favourite was the episode about the flying cow. Listen to it – you won't be disappointed.

The London Review of Books

Just great for broadening one's mind generally, but every so often there are also great episodes with a history focus – like the recent one given by Colm Tóibín on the cultural and political run-up to the 1916 Dublin uprising.  If you really want to know why 'All changed, changed utterly', then this is a good place to start.

The British History Podcast

This is another recent addition to my podcast list.  It's written and presented by a British expat living in the US. It's unashamedly narrative-driven, but takes this as an opportunity to cover the stories of British history in an engaging way, and often from unusual perspectives.  I'm only a few episodes in, but already I'm hooked.

*and a bonus episode- More or Less

Strictly speaking this great podcast isn't really a history show.  A few episodes ago, however, there was a great piece about the 'story of average'. Average, as a human construct, has a history and therefore a story of development which is not only interesting, but which I think helps us to understand why average is the way it is (and how it is used) today.  I wonder how much more successful my own mathematics education might have been had it taken a more historical approach.

Undergrad Day

Today is undergraduate day. I'm teaching a module on SEN in the secondary school to my undergraduate PE and Secondary Ed students in May, and I want to be well prepared. Earlier in the year I read an interesting study by Benjamin Bloom (1984) about 'mastery' and his attempts to solve the '2 sigma problem', i.e. the two standard deviations of increased attainment that he found between pupils taught in 'conventional classrooms' and those who were instead 'tutored' one-to-one or in very small groups.  I think that aspects of this study can help me with my students.
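To make the '2 sigma' claim concrete, here's a quick back-of-the-envelope sketch (my own illustration, not taken from Bloom's paper) of what a two-standard-deviation shift means, assuming attainment is roughly normally distributed:

```python
from math import erf, sqrt

def normal_cdf(x):
    # Standard normal cumulative distribution via the error function:
    # the proportion of pupils scoring at or below x standard deviations
    # from the mean.
    return 0.5 * (1 + erf(x / sqrt(2)))

# Bloom's claim: one-to-one tutoring shifted the average pupil's
# attainment up by two standard deviations relative to conventional
# classroom teaching.
conventional_percentile = normal_cdf(0.0)  # the average pupil sits at the 50th percentile
tutored_percentile = normal_cdf(2.0)       # the same pupil after a 2-sigma shift

print(f"Conventional: {conventional_percentile:.1%}")
print(f"Tutored:      {tutored_percentile:.1%}")
```

In other words, the average tutored pupil outperformed roughly 98% of conventionally taught pupils – which is why the effect seemed so worth chasing.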

This study seems to be one of the original studies that informed the current vogue for 'mastery' approaches in teaching and assessment. The recommendations are for iterative cycles of formative testing which allow a student to reach the desired 'mastery' level of attainment. I'll not go into that now (perhaps I'll plant a seed in the 'post-garden' and come back to it later). Suffice to say that I think Bloom underestimates the time cost, and fails to spell out what he really means by mastery (80% in a test score is the usual level – which, as we can see, means pretty much nothing on its own).

What grabbed me more is an idea that Bloom develops from Leyton (1983): techniques that "enhance the students' initial cognitive entry pre-requisites" (who said that educational research can't be easily understood?!). Broadly, this means 'making sure they know and can do the things they'll need to be able to do before they start to learn the new things that you have to teach them'.

Today I’ll be reading through my course materials, looking at the development activities I want them to do during the 10 weeks of the module, and working out a list of these ‘prerequisites’.  I’ll then scrap the first week’s sessions and turn them into a ‘prerequisites’ week.  I might have to think of a snappier title… any suggestions?

Where will this lead?  I'm hoping to make a 'knowledge organiser' which the students themselves have to complete, and which I'll then check over formatively.  I'm sceptical that an organiser on its own will do anything (I need to make sure they read it and commit the ideas to memory, for a start), but I'm hoping that if they have a first go at coming up with the ideas, which I then correct, this will give me an idea of where they're coming from, and give them a couple of chances to understand the material they need to know.   I'm hoping that my prerequisites audit will also help inform decisions about the way I structure the workshops and seminars that follow, and the content of the weekly lecture, as well as giving me some clear hooks to attach to ongoing quizzing.   I'll let you know how it goes.

Bloom, B.S., 1984. The 2 sigma problem: The search for methods of group instruction as effective as one-to-one tutoring. Educational researcher, 13(6), pp.4-16.

Historic Environment Studies – AQA in more depth

Last week I took an overview of all the historic environment studies. Though they're (mostly) worth around 10% of the GCSE, I wonder if they'll be giving many HODs and teachers something to worry about as they start to think about their choice of board and specification. This is mainly because the concept of an historic environment study will be new to many teachers, especially those who have been doing modern world teaching (as I have).

This week I want to look more closely at AQA's offering. Their studies are interesting because they are so closely embedded in the depth study with which they are associated. The questions allow (indeed require) students to use their knowledge of events and society in the period studied, its fashions and preoccupations, in writing their answers.  This means that the period study content should be read side by side with that of the H.E. study. The kinds of locality implied by each H.E. study should also be taken into account when planning which unit to teach.

The Medieval Units

The two early periods have a strong military focus.  The Norman period could imply studies of early castles, such as Pevensey, whilst the Medieval unit, with its focus on the conquest of Wales, suggests the development of castles such as Builth Castle in Powys.  The earlier Norman period has a focus on military tactics and innovations that is not present to the same degree in the Medieval study, though both units mention battles that could be the focus of future H.E. assessments.

However, both also have strong social history aspects. So, whilst the Norman period has a focus on the village which would enable the board to set a medieval village location, and a focus on the changes that the Normans made to Cathedrals and churches, the Medieval study focuses on the development of towns.

The Early Modern Units

The Elizabethan unit is the one I find hardest to pin down to particular locations, or types of location. The focus on the rise of the gentry and on living standards might mean a focus on the homes of the nobility – indeed this is the focus chosen for the specimen assessment material.  We could also read into the content on the church a study of Protestant or Catholic places of worship.  The spec also mentions theatres, so putting a tenner on the Globe being one of the locations might be an option.

The unit on the Restoration has more to go on in terms of possible focuses for H.E. locations. Theatre is an obvious choice, as is the Medway in Kent, the scene of a famous naval disaster.  The big star of this unit, however, seems to be London: the plague of 1665 and the fire of the following year, coffee houses and Samuel Pepys, and the focus on fashions and the changing face of the city make it an obvious candidate.

The Specimen Assessment Materials

Whilst looking at the specimen assessment materials confirms how much these H.E. studies are embedded in the context of each depth study, common threads in the approach to assessment do emerge across the studies. For instance, the questions emphasise the context of each locality, asking about the use of castles to control areas in the Norman and Medieval studies, or about Restoration fashions reflected in Bolsover Castle.  The mark schemes show, however, strong preferences for answers that focus on the design and materials, as well as the symbolism, of the various features of the locality concerned.  This is really exciting stuff – students will be given an opportunity to get to grips with physical aspects of the past that we have not previously had the opportunity to introduce them to. Additionally, they will be asked to think in terms of the mentalities of the past, to understand how buildings and places had such an impact on the minds of those living in the periods we'll be studying.

H.A. Northern History Forum: Global Learning

Wednesday's HA event at Leeds Trinity had a stall manned by Pearson which set out their 'Global Learning Programme'.  At the start of the keynote we were told of a CPD event being run (and paid for) by the university, designed to celebrate work being done by teachers on 'Global Learning'. Global Learning is clearly 'a thing' right now.

The HA website has more details of its take on Global Learning, and I understand that they have been helping Pearson to develop the programme, offered on a website here. It’s hard to argue with the HA’s point that

“much of the history curriculum provides a clear context for the current debate about poverty, globalisation and inter-relationships between the countries of the world, and helps students understand the current debate.”

My mind is also drawn back to Donald Cumming’s talk to the SHP conference in July 2014 in which he rightly pointed out that we cannot really understand the history of any country (and perhaps especially not the one in which I live and teach) unless we understand the history of the countries around it and the wider world. Globalisation and global interdependency are not recent developments, and we’re not really teaching history if we deny this to our students.

Whilst I was reading the key aims of global learning cited by the GLP and the HA, I wondered about the kinds of substantive topics that we could use to help achieve these various aims, which are to:

help young people understand their role in a globally interdependent world and explore strategies by which they can make it more just and sustainable,

familiarise pupils with the concepts of interdependence, development, globalisation and sustainability

enable teachers to move pupils from a charity mentality to a social justice mentality

stimulate critical thinking about global issues, both at a whole school and pupil level

help schools promote greater awareness of poverty and sustainability

enable schools to explore alternative models of development and sustainability in the classroom.

It seems to me that there are many substantive topics that we could use in trying to reach these aims.   I can also see that thinking about them could encourage us to think differently about how we ask students to think about the past from a global perspective.   Most obviously, a comparative 'long view' approach of the kind developed by Shemilt and Rick Rogers offers us a way of bringing a historical eye to these aims. By comparing and contrasting different modes of trade, causes of poverty and wealth, and the development of campaigns against injustice over time, we can help students understand how people in the past have wrestled with these issues.

If I can, I'd like to go to the conference, if only to see what it means to 'enable teachers to move pupils from a charity mentality to a social justice mentality'.  It is this aspect of 'global learning' that causes me most trouble, and has since I started teaching.  When teaching history we are, in my opinion, teaching a way of thinking, rather than what to think about a particular event.  History doesn't guarantee that our students will hold a particular opinion about a topic, but it should aim to leave them well informed enough to form an opinion that is well supported.  There are no single right answers to many historical questions, though there are lots of wrong ones!

So, I need to clear up what it means to be "moving students from a charity mentality to a social justice mentality", so that I can make sure that I'm not trying to replicate my own mindset or political views in my students.

Historic Environment Studies at GCSE

There are big changes coming at KS4.  Others have written excellent posts summarising the new specifications and the differences between them.  On reflection there's something for everyone in most specs – we will each find some aspects that seem familiar to us.  However, there is one new part of the GCSE – the historic environment study – which will be really new to most GCSE teachers.  I thought I would take a look at the differences between the specifications in overview.

Board | % of Grade | Embedded in another unit? | Specified site or centre choice? | Topics
AQA | 10% | Yes – in British Depth Study | Specified three years in advance (1) | Norman, Medieval, Elizabethan and Restoration historic environment
Edexcel | 10% | Yes – in Thematic Study | Specified in spec. (2) | Crime and Policing in Whitechapel from 1870 to 1900; Surgery and Treatment on the British Western Front 1914-18; London and the Second World War 1939-45
OCR – SHP | 20% (4) | No – though centres can do this | Centre choice (3) | Centre choice within 'parameters'
OCR | 10% | Yes – in British Depth Study | Specified in spec. | Urban Environments: Patterns of Migration; Castles: Form and Function 1000-1700
  • (1) AQA will announce the sites when approved by Ofqual.
  • (2) 'Site' is widely construed to mean 'London', 'Whitechapel' or even 'the Western Front'.
  • (3) There are guidelines in the spec. to help centres make the choice.
  • (4) The OCR – SHP spec. examines the historic environment study in a separate paper.


AQA’s historic environment studies are embedded in their British depth studies, and focus on specific aspects of the wider content related to those studies. Departments that follow the ‘Norman England’ option will therefore study ‘the historic environment of Norman England’, while those taking ‘Medieval England’ will study ‘the historic environment of Medieval England’. It doesn’t take a genius to work out that the departments teaching Elizabethan or Restoration England will also be teaching about the historic environment of each period.

The focuses in each 'historic environment' study align with those of the rest of each depth study, but there is a fair amount of generic description. So, whilst Elizabethan England refers to manor houses, gardens and theatres, and the Restoration period refers to 'stately homes', the Norman period mentions 'cathedrals' as well as 'castles', which also figure in the Medieval description. Each depth study refers to 'key historical events', though the only illustration given in each case is 'such as battles'.

AQA plan to publish the specific sites for each exam series three years in advance on their website. I can’t find reference to these yet, though I’m sure that they have planned the first three.

Update: I've had a very fast email response from AQA, who tell me that: "We will be publishing the sites three years in advance (it's in the draft specification), so for example, once we had an indication from Ofqual that this will be acceptable we will publish the sites for 2018, 2019 and 2020 to help teachers plan their courses. We'll also be providing individual resources packs for each site and overall guidance for schools."

The Historic Environment makes up 10% of the total marks in AQA’s GCSE


Like AQA's, Edexcel's historic environment component is embedded in another study, though in this case it is the thematic rather than the depth study.  At first sight this might imply an approach which considers how and why a site changes through time.  However, Edexcel have set out much shorter time periods in which the historic environment studies take place. For instance, though the Crime and Punishment in Britain study runs from c.1000 to the present day, the embedded historic environment study covers a much more focused thirty years, from 1870 to 1900, and is focused on the issue of crime and policing.

Similarly the Medicine Through Time study, which runs from 1250 to the present, contains the embedded historic environment study of 'The British sector of the Western Front', focused on the years 1914-1918 and the issues of 'surgery and treatment'. This pattern is repeated in the Warfare Through Time thematic study, where the London and the Second World War option runs from 1939-45, though it lacks the focusing subtitle that the others have.

The Historic Environment makes up 10% of the total marks in Edexcel's GCSE.


OCR is offering two different specifications at GCSE, and each has a very different approach to the historical environment.


The Schools History Project approach to the historic environment immediately sticks out from the crowd of the other three offerings.  In the SHP-OCR specification it is worth 20%, double the tariff of the other specifications. It is also the only specification to assess understanding of the historic environment in a separate exam.

The second and perhaps most significant difference is that the specification 'offers centres a free choice of site within a clearly stated set of parameters', with the hope that this will lead centres to study a local site 'that will enhance learners' developing sense of identity'. The choice of site is not totally free, as there is a list of 'parameters' (though these are really guidelines to help centres choose workable sites).  As with the other boards, there is no 'requirement' for a site visit, but the specification does say that one is 'desirable'.  There is no requirement for the study to relate to any other part of the specification, though I would imagine that many schools will choose a site related to the periods and substantive history that they will be teaching elsewhere in the course.


The alternative specification, in common with those offered by the other boards, embeds the historic environment within another study. Also like most of the other boards' specifications, the historic environment study makes up 10% of the final marks of the GCSE.  Like AQA, OCR have embedded their historic environment study within the British depth study.  There are two environment studies: "Urban Environments: Patterns of Migration" is the study for the BASA 'Migration to Britain' depth study, whereas for both "The English Reformation" and 'Personal Rule to Restoration' depth studies centres will take 'Castles: Form and Function 1000-1700'. This approach seems to imply an aspect of change and continuity that the others do not.

This approach also differs from the other specifications in that it involves both a board-specified site and a centre-specified site which 'complements the specified sites'.  Again, a site visit is 'desirable' but not required. The sites for both studies until 2022 are set out in the draft specification.

I will be making a more detailed survey of each of these specifications in the coming weeks, starting with the AQA spec.  I’d love to know what departments are thinking about doing with regard to the historical environment study – or whether it has figured much in your thinking so far?

Great History spotted on the Web

I have read such great history around the web this week that I thought I'd compile some of it into a post.  Most of what I've spotted should be directly relevant to teachers at KS3-5.  A good example is the fascinating life and views of John Lilburne, described by Michael Braddick over at Sheffield University's History Matters blog.  Lilburne provides the kind of life and example that we can use to great effect in the classroom, in order to ask questions like 'why did he spend most of his life in prison?', or perhaps to consider his historical significance (or lack of it!).  We might instead compare his treatment to that of Henry Vane or John Lambert in asking whether the restored monarchy was really as reconciliatory as the Declaration of Breda might suggest.

I have also been listening to the amazing podcasts at Alex Ford's Meet the Historians.  This is a really exciting and ambitious project to enable students to access the thinking of historians through interviews with their history teacher.  Alex asks some really interesting questions, and the historians are given the time and space to answer. What I like most about the series is the example that Alex sets for us and his students: history is not only something that we have learned, it is something that we do, and something that we can keep on doing.   Secondly, I think about the times we might vaguely exhort our students to 'interact' or 'engage' with the sources. From this podcast students can hear, and perhaps understand, what interaction with a historian might really mean.

I was drawn to this post by Scott Allsop's tweet.  The article itself is a bit rambling, but it gets interesting right about where it discusses using diagrams to show relationships between countries, and in its central idea that devices such as these can 'force us to expand our conception[s]'.  I often use diagrams and simplified maps to try to explain complex things. As always the devil is in the detail, but figuring out the detail can help students to understand where the limits of their knowledge are, and to put the detail back into the big picture.  As an example, I asked my own students to update their diagrams of the feudal system last week.   They came up with some interesting ideas, including a feudal donut with the king in the middle.  One really interested me.  It showed a house with a small, dank cellar in which slaves worked, and two lower floors for villeins and freemen.  These floors were connected by stairs, which also led to the upper knightly and aristocratic floors, and finally to the attic where the king resided.  Crucially, some of the stairs had baby gates installed, to make it harder for people to move upwards. This made it nearly impossible to become ennobled, but relatively easy to slip between free and unfree status, depending on whether you could afford to rent land.  I can't claim that this is a finished or full understanding of the feudal system – I think that what's going on here is the replacement of weaker misconceptions with stronger conceptions. It's certainly better than the boring old feudal pyramid that in the past I have taught in one lesson, and which pupils then forget.

Thinking about how history is done, over on Gaby Mahlberg's blog there is a really interesting post in which she reviews Writings of Exile in the English Revolution and Restoration by Philip Major.  The book itself deals with the culture of Royalists at home and those exiled beyond England, and seems to offer a glimpse into the way Royalists dealt with the dislocation and loss that come of exile.  However, what grabbed me was the way that Mahlberg describes these topics as a "newish and still only patchily explored field", and her judgement that the book "posed many important questions, successfully answered some, but also left enough for the rest of us to puzzle over".  Mahlberg's review is not a question of whether Major's 'interpretation' of exile is 'correct' or 'accurate'; rather she seems to suggest that history is a joint venture between countless eyes and hands, all building on each other's work.  Not only that but, as there will always be new fields to explore, the work of history can never be finished.  I'd add that old fields can often contain new surprises.


Teacher Dashboard and Google Classroom #28daysofwriting

I used to think that ICT would 'transform' education, and that it could also 'transform' society.  Well, perhaps it will, but it hasn't yet.  As I get more experienced it seems to me that ICT, like any tool, has its benefits and its downsides.  It also seems to me that one of the big problems with the use of ICT in learning is that students quickly learn to game whatever system they have been asked to work with, and that this works directly against my main aim as a teacher.  I want students to slow down, to get caught up, to be forced to think again. They want a high score, or to get to the end, or simply to be finished and on to the next thing.  Even if they get beyond this, often they want what they're doing to be 'good' (or sometimes 'good enough'). ICT can make all of this far too easy.

That’s why I tend to use less ICT directly in the classroom than I used to, and when I do I always try to ask myself ‘why am I taking the extra time to do this using ICT?’ or ‘why are we learning this in the ICT suite instead of our normal classroom’.  Sometimes I can’t find a decent answer to this question, and then we go back to the classroom and to books and pens and pencils.

In the past I have used classroom blogs a great deal, and I know colleagues who use them to great effect – Alan Kydd's, for instance.   However, sometimes I don't want a public blog for my class.  I want to know who is reading it, and I don't want to worry about the administration of usernames and privileges.  What I do want is a quick way of getting information, links and assignments to students.  Previous experience with various VLEs has taught me that this can be an enormous pain in the bum, and that the difficulties these things present can quickly sap the energy from efforts to use ICT to help teacher-student communication.

Recently I've been looking at Google Classroom, which does seem to offer some quicker and easier solutions to the problems I have run into whilst using classroom blogs.  Classroom isn't transforming my practice, but I am finding it useful for the usual things like homework reminders and answering queries from students.   What I like best, however, is using it to flesh out those throwaway remarks or passing conversations we have with students who are interested in topics not directly related to our syllabus: links to extra reading, radio or TV shows, catch-up notes and historical novels that we have discussed.

Teacher Dashboard is a set of apps with even more potential, which I'm still experimenting with.  This service from Hapara gives you the ability to create a folder in your students' Google Drive (not their personal drive, but one connected to their institution), and to send them Google documents and other resources.  Using the dashboard I can then tell which students have amended their documents, and when they did so. I can also give them feedback on their work as they progress.   I've been using this with some year 10 GCSE students. Their assessment in 2016 will be on paper, so I'm reluctant as yet to spend a great deal of time asking them to type answers into Google Docs.  What I have been using it for is revision presentations.

I have been asking students to go home after each lesson and make two or three slides to record what they learned.  In this way I'm hoping to encourage them to see that revision shouldn't be something that happens only at the end of a course, or just when an important assessed test is coming up.   Trying to use something I learned from Make It Stick – that effort expended in trying to remember something will help later recall – I ask the students to first draft their slides without looking at their notes.  Only when the first draft is done should they turn to their notes.  We have a short formative assessment every month or five weeks, and they hand in a printed version of their revision presentation as the test starts.

I can't honestly say that this has yet had a huge impact on grades, but I have noticed that their retention and use of important information has improved.  What it is doing is setting up a routine and an expectation that revision is ongoing.  I also get to see what they do when they revise, and I'm going to use this to help them revise better as the course goes on.

So, Classroom and Teacher Dashboard aren't revolutionary ICT, but they are genuinely helping me in my task of enabling students to learn.