The possible fallout from cancelling exams for teachers


There will be many consequences for education due to Covid-19. It’s not even possible to predict what they will be at this point, when we don’t know how long schools will be closed.

However, one thing we do know is that the GCSE & A Level exams have been cancelled, and the DfE announced on Friday how student grades will be calculated. This blog will focus specifically on the potential fallout of this for me as a classroom teacher. As with anything, depending on your view of assessment and accountability, we will have differing opinions on whether these consequences might be a good or bad thing. I will give my opinion on what I think.

In-year data collection

We were just getting to a stage where most schools were reducing the number of times that staff have to enter data onto whole-school systems to report student progress/attainment. Some schools may choose to use this data on their students to help generate a grade to submit to the DfE. However, this is potentially problematic. Depending on the school/teacher, this data was never designed to be used as a predictor of the actual grade a student may get. If it is used internally to monitor or judge teachers, the grades may be overinflated. If they’re used with students as a motivator, they might be underestimated to encourage a student to work, or slightly overestimated to encourage an unconfident student to carry on.

A positive of this is that schools may look at the process teachers use to calculate what they are entering into the system. The downside is that leaders may only be using it for the end results rather than the journey of getting there. For me, this data is part of a discussion within our department and with my line manager, not just an exam grade predictor tool. Will leaders reintroduce more data collection on the chance that this might happen again? Let’s hope not.


Mock exams

These have been a focus of discussion amongst teachers. Some even believed that the DfE would get us to send student mock exams to exam boards to mark or for moderation. One person insisted that the rule is that papers have to be locked up after students have sat them, in case of unforeseen circumstances. These people clearly have no idea what other people do regarding mock exams.

I tweeted a long list of problems with using mock exams to generate predicted grades:

  • Students sat different papers to generate a grade
  • Grades were calculated using different boundaries
  • Some subjects haven’t done mock exams
  • Mock papers have gone home
  • Predicted grades could have been manipulated for….
    • motivation
    • course entry
    • assumed work student will do
    • parental/student pressure
  • Not all teachers are exam-board trained in accurate mark scheme marking
  • You can’t apply a previous set of boundaries to a different cohort
  • Teachers may have been intentionally harsh/lenient in their marking
  • Teachers have a vested interest in positive results (linked to performance management/pay)
  • Teacher predictions are generally poor

Following this year, my concern is that school leaders might change how mock exams are used in schools.

I strongly believe that all work students do is formative, until the final exam. They should have opportunities to learn from their work and be able to look at them as part of their revision notes. Locking them up in a cupboard stops this.

I don’t use marks/grades with students throughout the GCSE. I’ve blogged on my thoughts on data, many times….

I really hope that, because of this, leaders don’t start doing things like creating spreadsheets of made-up data to ‘track’ student ‘grades’, making us grade pieces of work that are ungradable (e.g. one exam question), choosing the paper that students do as a mock instead of leaving it to the HOD, making me mark using marks/grades, or applying previous grade boundaries to anything a student writes (grade boundaries don’t work like that!)….. All of these things fail to improve learning; they take up teacher time trying to calculate something that can’t be accurately calculated.
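To show what I mean about boundaries, here is a minimal, purely hypothetical sketch (the marks and boundaries below are invented for illustration, not real exam-board figures): because boundaries are set separately for each paper and cohort, the same raw mark can map to different grades, which is why last year’s boundaries can’t simply be reused on a different paper.

```python
# Hypothetical illustration: grade boundaries are set per paper/cohort,
# so the same raw mark can translate into different grades.

def grade(raw_mark, boundaries):
    """Return the highest grade whose boundary the raw mark meets, or 'U' if none."""
    # boundaries: dict of grade -> minimum raw mark, checked from the highest boundary down
    for g, minimum in sorted(boundaries.items(), key=lambda kv: kv[1], reverse=True):
        if raw_mark >= minimum:
            return g
    return "U"

# Invented boundaries for two different papers (not real figures)
boundaries_previous_paper = {"9": 68, "7": 55, "5": 40, "4": 33, "1": 10}
boundaries_harder_paper = {"9": 61, "7": 48, "5": 34, "4": 27, "1": 8}

raw = 36
print(grade(raw, boundaries_previous_paper))  # '4'
print(grade(raw, boundaries_harder_paper))    # '5' - same mark, different grade
```

The point is that the boundaries only make sense for the paper they were set for; reusing them on a classroom mock, or on a single question, tells you very little.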

Weighing the pig doesn’t fatten it, but might some school leaders think that students need to do more exams to give more data? I personally don’t have a problem with students practising, but only when it’s formative, not summative. Those that love a ‘walking, talking mock’ would spend hours of Year 11 with students in a hall, instead of thinking about the long-term learning of a student. I would be mightily annoyed if we have to set more exams and mark them, only for them not to be used for learning…..

Exam board predicted grades

Those of you who have been teaching for a few years will recognise these. We used to have to complete them a few months before the exams, predicting which grade we thought a student would get. These were then sent to the exam board in the belief that, if for any reason a student didn’t sit the exam, this is what the board would use. However, without any communication that I remember, they stopped.

From my own experience of when these were in use, I once had a student who broke their arm the day before the exam and was too unwell to sit it. I had predicted them a ‘C’. They were given an ‘F’. Why did this happen? Well, someone on Twitter pointed me to this research by OCR, at about the time general OMRs were ditched.

Unsurprisingly, teachers were more likely to be optimistic than pessimistic with grade predictions.

UCAS statistics show: “In 2019, 21% (31,220) of accepted 18 year old applicants met or exceeded their predicted grades, a decrease of 3 percentage points. In addition, 43.2% of accepted applicants had a difference of three or more A level grades”. What this tells us is that when the stakes are high, predicted grades are high!

So, the high-stakes nature of teacher prediction makes it unreliable. Asking teachers to do it more won’t make it any more reliable; in fact, if it is high stakes, it gives teachers more reason to be generous. Will exam boards bring these back? Doubtful. If they do, it’s more teacher work.

The positives

Maybe there will be some positives to come from this unprecedented (Yes, I said that word) situation. My hopes would be:

  • Teachers learn more about how exam boards generate exam grades and how papers and grade boundaries are linked
  • Considering our use of internal school data more carefully, in many cases relying not on mock grades but on the student’s whole GCSE/A level experience
  • Ditching revision and teaching it well
  • Considering the whole curriculum and how it feeds into KS4/5
  • Discussing students individually, not focusing on data output from a spreadsheet
  • More leaders realising that predicted data isn’t always reliable

I really hope that we don’t go backwards, due to this hopefully once in a lifetime event…..
