Trithemius, and how not to revise

One of the best school assemblies I have heard during my time as a teacher concerned the subject of revision: how to do it, how to organise it, what works well, what doesn’t. It was an interesting subject for me for several reasons. At school I often found it difficult to sit still and concentrate, alone with my files and books: revision, in fact, was an activity I never felt I quite mastered (and this wasn’t simply a case of normal teenage activities feeling more interesting than sitting down for a couple of hours with a chemistry textbook). My technique, I was quite aware, wasn’t right.

Now that I am a teacher myself, pupils will often ask me for advice on how best to revise, and of course I try to offer helpful suggestions. This involves rehearsing conventional wisdom about the importance of working in manageable chunks of time, with regular breaks, in accordance with a clear plan. I suggest that pupils concentrate, in particular, on revising what they’ve struggled with in the course of regular term-time study, and that they take the opportunity to ask teachers for extra resources where relevant. I also suggest talking things through with friends and/or family, so that revision doesn’t become a lonely experience. Vocabulary revision and key term learning, I suggest, benefit particularly from this.

Anyway, one of the highlights of the excellent assembly I heard was the way it addressed a particular topic: the perils of copying material straight from a textbook or from your notes, repeating it verbatim on a separate sheet of paper. This very common method, we were told, is a particularly unhelpful way of doing things. Well, I have to confess: this exact method was one of my main ways of revising when I was at school… and it certainly did always seem a bit – well – slow and ineffective.

What, then, to do instead? You’re better off, we were told, spending a minute or two casting your eye over a given page, then – from memory – trying to write out everything of importance that you can recall (before cross-referencing it against the original). Then repeat if necessary. This way you can quickly build up your memory of the main take-homes on a given topic. In addition, you should develop your own questions and ideas about what you’re studying and use these to build up a sequence of questions and answers on a given subject. When it comes to straight knowledge acquisition (brute, fact-based learning for, say, Biology or Chemistry exams), these seem to me excellent recommendations.

The memory of my old, obviously inefficient, approach to revision returned to me over the past few days in a quite separate context, as I was reading about the thought of the Renaissance humanist scholar, Benedictine monk and bibliophile Johannes Trithemius. In the words of the historian Anthony Grafton, whose brilliant book Worlds Made by Words: Scholarship and Community in the Modern West I have been reading, Trithemius was a famous German abbot and spiritual counsellor who had to spend ‘the last 15 years of his life dealing with the accusation that he was a magician who employed diabolic help’. Just the sort of character I like to encounter in my bedtime reading.

What struck me most as I learned about Trithemius was his idea of the benefits of scribal activity, his sense of what just copying out a text (in a way analogous to what I had done as a revising teenager) can do. The monk who spends his time copying out holy books, Trithemius thought, ‘will not be burdened by vain and pernicious thoughts, will speak no idle words, and is not bothered by wild rumours’. Instead this monk will be ‘gradually initiated into the divine mysteries and miraculously enlightened’.

A copy of Trithemius’ De regimine claustralium

Sure, I had never copied out ‘holy books’. But still: it was interesting to see that the monastic discipline of copying texts was more than a means of just preserving the texts in question. It was a way, Trithemius thought, to gain access to the divine: ‘every word we write is imprinted more forcefully on our minds since we have to take our time while writing and reading’, he wrote.

The ‘imprinting’ Trithemius had in mind was not a simple matter of rote learning, then: it was a matter of training in spirituality, in ethics, and in theology. But knowledge acquisition clearly mattered too: copying the texts was a method of instruction, after all. Trithemius was no expert, obviously, on the cognitive psychology of test preparation (a topic on which teachers today can expect to be lectured). Naturally the approach he recommends wouldn’t have led to optimal preparation for 21st century school examinations.

The abbot was articulating his ideas at a time when the recent spread of the printing press across Europe had started to have a dramatic impact. The world of the monk copying texts in a cloister would soon be shaken to its core, not just by the printing press, of course, but by the 16th century reformers and their challenges to monasticism itself across western Europe.

Monks in a cloister

Trithemius was vigorously opposed to the use of printing presses: he saw them, among other things, as a recipe for idleness in monastic communities. The vision he cherished was of a Benedictine world in which ‘brothers do not spend their time in idleness, but practise the work of their hands’, i.e. by copying out texts. And there was a role for every member of a community here: while some would copy, others would bind the books in written codices, others would correct them, and others would rubricate them. This, he thought, is what ‘holy labour’ looks like.

This, then, was a world in which the copying out of texts wasn’t simply a dud revision technique, but an activity which might give life and shape to a community, an activity which could even be conceived as a way to achieve spiritual development.

From a present-day vantage point, it is interesting to me that the reading of texts (whether literary, religious, self-help or otherwise) is still widely held to offer the same kind of potential benefit Trithemius attributed to copying. People can grow and develop emotionally through their reading: this much is still widely believed – and uncontroversially so – in the 21st century.

But, by contrast, the copying or writing out of a text is – now – not assumed (to my knowledge) to carry any similar benefits. Trithemius’ world of the text is not ours.

A very happy 2021 to all readers of this blog.

On the A level fiasco

A level results day yesterday was a very unusual day. There has been a lot of comment, a lot of confusion, and a lot of ‘noise’. It’s not yet clear whether we’ve seen the final word on the subject from the government: it’s still possible that last minute amendments could be made to the process for generating grades, perhaps to parallel what has already happened in Scotland.

In this post, I want to question some features of the coverage of the day itself in newspapers and other media outlets, the nature of some of the responses I’ve seen to the headlines, and some of the decisions the government and Ofqual, the regulator, have made. Suffice it to say, the quality of comment out there has been variable. Some thoughtful, lucid and careful analysis has appeared; other analysis has been crude, simplistic and misleading. The former kind is vital in a situation where a large number of young people find themselves confronting a really upsetting and difficult situation and (understandably) want answers.


The initial decision

Some words first on the government’s initial decision, back in March, to cancel the summer exams. From this everything else has flowed.

Certainly there were those (including me) who felt that this was a decision that needn’t have been made. In the recent words of Vicky Bingham, head of South Hampstead High School, ‘we could have managed exams. With a bit of imagination and planning we could have done it’. This reflects the thoughts of numerous teachers and leaders in education back in March.

In place of actual exams, the government asked schools to produce teacher-generated grades (or, to use the jargon, CAGs – centre assessment grades). These were then to be subjected to a process of review and moderation by the exam boards (overseen by Ofqual, the exam board regulator).


Ofqual and the government could have offered some really clear advice to schools about how they would conduct the process of reviewing and moderating the grades that schools submitted. Perhaps they could have done so along the lines suggested here. They didn’t.

A further crucial intervention that could have been made at this stage would have been to try to prevent a situation where pupils would miss university offers on the strength of others’ best guesswork. Could special arrangements of some sort have been made for pupils in this situation? If so, what? One possibility would have been to ask universities to defer places for such pupils until they’d had a chance to sit actual exams, either in November or next June. Or to ask them to hold open as many places as possible, regardless of exam outcomes. If anything of the sort happened, no one has heard of it.

It’s pretty clear to me that not enough thought went into trying to look after the interests of pupils in this category, to try to find something like a ‘fair’ route forward for them.

CAGs, Ofqual’s algorithm and grade adjustments

School-generated grades were always going to be a mixed bag. Some schools achieve very similar results year-by-year. Other schools can expect quite different results, depending on the cohort of students they have. Some schools have smaller year groups (where bigger year-by-year shifts are quite likely); some don’t.

Equally, some teachers and institutions – for various reasons – will have been quite generous with their pupils’ CAGs; others won’t. On the whole, generosity seems to have been more common. Back in July, Ofqual revealed that schools had submitted grades that were 12 percentage points higher than last year’s grades.

Some form of control on what was happening would be necessary, then, if we were not to see significant grade inflation. Or, to frame the matter in a different way, if we were not to see some pupils benefit from having generous/optimistic teachers, where others had had no such privilege.

Ofqual’s plan was to apply statistical analysis, via a specially configured algorithm, to the grades it had gathered from schools. Crucially, schools’ exam performance in previous years would be factored into the analysis and used to generate a similar batch of results for the present year.

Ofqual’s use of its algorithm is described, with clarity and in detail, here in (of all places) the Daily Mail. What we have ended up with, as a result, is an overall picture that looks pretty similar to the picture we’ve seen before in recent years.

The risk of sudden grade inflation, then, has been avoided. But the changes needed to achieve a familiar-looking picture have been considerable, and they have created needless upset.

An article on the BBC website reports how CAGs as a whole have been affected by Ofqual’s interventions. The graph below shows the overall picture and speaks for itself:

Chart: Ofqual’s changes to CAGs (BBC)

A lot of press reporting has focussed on the fact that high-performing (outlier) students in schools where results are mostly low or average have been badly affected by the (mainly downwards) adjustments that have been made. If you are at a school whose pupils rarely achieve top grades, and your teachers have (justifiably) predicted you, say, straight A*s, you are less likely – once an adjustment is made by Ofqual’s algorithm – to receive your teachers’ predictions than a pupil at a school where results are very often high. There is a nice case in point of this precise situation here.

To compensate for this sort of situation, Ofqual could have created a clear, straightforward appeals process, as suggested by Sam Freedman in this excellent Twitter thread here. The government could perhaps even have made special arrangements for such pupils to sit their exams in June or July. It didn’t. Instead, it’s been left to universities to pick up the pieces, making ad hoc judgments about the students who find themselves in this situation. Some students, inevitably, have been left disappointed. And some universities/courses/colleges have been more generous than others in responding to the situation.

For the vast majority of students who have experienced it, being downgraded must be a very bitter pill to swallow. Students have been able to find out the grades their schools submitted on their behalf. For those who have not received those grades, especially if this means missing out on a university place, I feel incredibly sorry. My overriding feeling is that all the problems we have seen could have been obviated earlier in the year if the government had simply been determined that, come what may, all pupils would sit their exams.

How pupils have been affected: by school, by background

A further strand of the press reporting on the A level debacle has focussed on differences in outcomes between pupils in different types of school, and from different socio-economic backgrounds. Clearly there are some slight differences in the likelihood of having your teachers’ grades changed, depending on these factors. But to talk of ‘bias’, as some commentators have done, in my opinion profoundly distorts the issue.

The table below illustrates the issue in relation to school type:

Table: grade adjustments by school type

The first table summarises adjustments made by school type. A big part of the commentary claiming there has been ‘bias’ toward independent schools in Ofqual’s policy has hinged on the statistic highlighted in yellow: 4.7 percentage points more A*/A grades were awarded to independent school pupils this year than last. There are several things to say here, and perhaps I’m particularly inclined to say them as someone who works in the independent sector.

First, that the independent school sector is diverse. Inevitably some schools will have done very badly out of what’s happened, and their statistics won’t reflect the overall trend. Others will have done well. A school that sees large variations in year-by-year performance, with a particularly good year group this year (with GCSE results to match), for example, may not have done well out of Ofqual’s review process. Put simply, many pupils at independent schools won’t have been advantaged by what has happened this year at all.

Second, although 4.7 percentage points seems a clearly higher figure than the others in the table, when seen as a proportion of A*/A grades awarded it’s in fact not much out of keeping with them. It represents an 11% relative increase in A*/A grades. For secondary comprehensive pupils there was a 9% increase in these grades; for selective secondary schools it was 3%; for sixth form/FE colleges 1.4%; for academies 7%; and for ‘other’ institutions 13%. So independent school pupils have done slightly better on the whole, but the 4.7 figure isn’t the outlier it might seem in the table when the statistics are seen in proportion.
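A side note on the arithmetic here: a change measured in percentage points and a relative (proportional) increase are different things, and the comparison above turns on that distinction. A minimal sketch of the calculation, where the baseline share is my own assumption (it isn’t stated in the tables), chosen so that the numbers line up with the figures quoted above:

```python
def relative_increase(baseline_share: float, point_change: float) -> float:
    """Relative increase implied by a percentage-point change in a share."""
    return point_change / baseline_share

# Assumed baseline: last year's A*/A share at independent schools
# (hypothetical figure, picked so 4.7 points works out at roughly 11%).
baseline = 42.7   # % of grades at A*/A last year (assumption)
pp_change = 4.7   # percentage-point rise this year (from the table)

print(f"{relative_increase(baseline, pp_change):.0%}")  # prints 11%
```

The same calculation explains why a 4.7-point rise on a large baseline can be proportionally modest next to smaller point changes on much smaller baselines.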

Third, as some in the commentariat have pointed out, independent schools are more likely to have small class sizes than other types of school. For small class sizes, Ofqual has said that there is a problem applying its algorithm: the data pool is too small. As a consequence those with small class sizes have been more likely to receive their teacher-allocated grades. A further word here on the existence of these small classes, which some commentators have described as sites of privilege, of advantage. Well, another way of seeing them is just as places where subjects which are not currently as faddish as the most popular A level choices are being taught.

Fourth, it may just be that teachers in some schools – e.g. selective secondary schools and Sixth Form/FE colleges – submitted CAGs that were less optimistic for their pupils. If so, Ofqual should have adjusted for this more than they have done.

Below is a summary of outcomes for pupils of different socio-economic backgrounds:

Table: outcomes by socio-economic background (SES)

The bullet points above the table are worth reading. On the face of it, it perhaps looks as though students of ‘Low SES’ have been more harshly treated: 10.42% of their CAGs have been downgraded, as opposed to 9.49% for medium SES and 8.34% for high SES pupils.

However, a) teachers of low SES pupils, as the table notes, are more likely to over-predict (a tendency borne out by historical data); and b) low SES pupils have actually achieved more highly this year (74.60% achieving C or above) than in either 2018 or 2019.

It has been a similarly good year for those in the Medium SES category, 78.20% of whom have been awarded C or above, a better statistic than for either 2018 or 2019.

Only those in the High SES category have done worse than was managed in 2018, though they have done better than was managed in 2019.

So, when looked at carefully, ‘bias’ here, or in relation to school type, is difficult to detect. Of course, it ought to go without saying that ‘bias’ toward one or another type of school is something it would be bizarre to find a regulator doing.

The individual and the system

From my point of view, as I’ve said above, the real argument about these A level ‘results’ ought to have been happening in March. It’s all very well Keir Starmer today condemning the ‘disaster’ of this year’s results (and he isn’t wrong), but where was he earlier in the year, when he should have been pointing out the entirely predictable problems we’ve seen?

For a certain type of newspaper columnist, and for a certain type of Twitter commentator, everything we’ve seen over the past two days points to ‘systemic bias’, to social inequality, and to entrenched injustice. ‘The whole [exam] system’, as one senior academic puts it, ‘is not to recognise individual merit or attainment, but to maintain its own credibility in the eyes of big business and pundits, and to sort the population according to predetermined criteria and maintain the status quo’. Some would say that’s a quite extraordinarily cynical way of seeing things.

But a less extreme version of this strand of opinion has been widely endorsed by numerous public figures. This claims simply that this year’s results have been ‘biased’, or that, in the words of the historian Peter Frankopan, the results have been ‘devastating for social equality/mobility’. Or that, in the words of Andy Burnham, ‘the government created a system which is inherently biased’ against FE and Sixth form colleges.

In my view, when looked at carefully, the tables considered in the section above highlight the obvious problems with perspectives like these, where the only ‘bias’ I can detect is in the attempt to seek broadly to replicate the results of previous years.

What is clear to me is that an opportunity has been seized to make some political capital out of the situation, by suggesting that the government has been biased against those of lower socio-economic backgrounds. On this occasion, though, the data simply doesn’t bear that out. And it’s almost as though some people are so intent on foisting their social theories and political assumptions on the data that it doesn’t really matter how badly those theories and assumptions actually tally with the facts.

A more productive way forward, surely, would be to emphasise that this has been a fiasco that has touched many people – pupils, teachers, parents, schools, universities – across the whole education sector, and across all sections of society; that very many pupils have been adversely impacted through no fault of their own; that proper plans and support should be put in place for those pupils who want to sit their exams; that now is not the time to play political football with young people’s lives and aspirations, but to try to respond to the situation without reaching for extreme judgments.

It’s good to see, for example, that Worcester College Oxford has promised to fulfil its offer of places to all who received one. I wonder about the practicalities/good sense of all universities/colleges doing this, but it’s certainly a very decent gesture.

Meanwhile, if there is to be a wide-ranging and trenchant critique of the socio-economic forces at play around A level assessments, it would be nice to hear it done thoroughly, accurately, and at a really opportune moment: like, for example, in March.