

Thousands of A-level students get their “results” tomorrow. Covid-19 meant exams were cancelled. So, instead, their grades will in many cases have been generated by a hastily built government algorithm.
We’ve got concerns about this algorithm.
Firstly, as is so often the case, there’s a lack of transparency about what this algorithm does and how it works.
This is unacceptable. Students have a right to understand how decisions which affect them are made. And algorithmic decision-making needs to be open to scrutiny.
Secondly, it appears the algorithm grades schools rather than students: “Where a subject has more than 15 entries in a school, teachers’ predicted grades will not be used”.
So an individual student’s life chances hang on an estimate based on their school’s historic performance.
We’re concerned that this could discriminate against high performing students in historically underperforming schools.
An excellent student who was on track to get their school’s first A, now almost certainly won’t - because of an algorithm.
Students at smaller schools appear to get more personalised treatment. Ofqual have said they will factor in teachers’ assessments for schools where fewer than five students have entries in a subject.
These will often be better-resourced/private schools.
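To make the concern concrete, here is a minimal sketch, in Python, of the kind of standardisation being described. Ofqual has not published its actual model: the 15-entry and 5-entry cut-offs come from its public statements, and everything else here (the function name, the data shapes, and the rank-matching step) is our illustrative assumption.

```python
# Illustrative sketch only -- Ofqual has not published its model.
# The 15-entry and 5-entry cut-offs come from its public statements;
# the rank-matching step and all names/shapes are our assumptions.

def assign_grades(students, historic_distribution):
    """Assign grades to one school's cohort in one subject.

    students: list of (name, teacher_predicted_grade), ranked best first.
    historic_distribution: grades the school achieved in past years,
                           best first, e.g. ["B", "B", "C", "C", "D", ...].
    """
    n = len(students)
    if n < 5:
        # Small cohorts: teachers' predicted grades are used.
        return {name: predicted for name, predicted in students}
    if n > 15:
        # Large cohorts: predicted grades are ignored entirely. Each
        # student receives the grade at their rank position in the
        # school's historic distribution -- so no one can receive a
        # grade the school has never achieved before.
        return {
            name: historic_distribution[min(i, len(historic_distribution) - 1)]
            for i, (name, _predicted) in enumerate(students)
        }
    # Between 5 and 15 entries: some blend of the two (details
    # unpublished); as a placeholder we use the predictions.
    return {name: predicted for name, predicted in students}

# The excellent student on track for their school's first A:
cohort = [("Curtis", "A")] + [(f"student_{i}", "C") for i in range(19)]
history = ["B"] * 4 + ["C"] * 10 + ["D"] * 6  # school has never awarded an A
print(assign_grades(cohort, history)["Curtis"])  # -> "B", not "A"
```

Under that assumed rank-matching step, the top-ranked student can never receive a grade their school has never achieved, which is exactly the worry about high performers at historically underperforming schools.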
The Scottish system, which First Minister Nicola Sturgeon scrapped this week, appeared to have similar problems. Sturgeon has since acknowledged it wasn’t fair and pledged a reform that will take greater account of pupils’ individual circumstances.
We also think automating a major decision about pupils in this way potentially violates the GDPR and the UK Data Protection Act. Those laws (e.g., Article 22 of the GDPR) give people significant protections against automated decisions that may have serious consequences for them.
Foxglove has a track record of challenging dodgy government algorithms – last week we helped force them to abandon their racist visa algorithm. https://www.foxglove.org.uk/news/home-office-says-it-will-abandon-its-racist-visa-algorithm-nbsp-after-we-sued-them
For this potential case, we’re working with an A-level student from Enfield, called Curtis. He’s also set up this petition calling for a fairer system: https://www.change.org/p/boris-johnson-boris-johnson-we-need-a-fairer-system-for-this-year-s-a-level-and-gcse-students
You can see our press release about this, which we’ve just sent out and which contains a bit more detail, here:
https://mailchi.mp/d41a9f90bced/legal-challenge-to-a-level-results-algorithm
So, watch this space, and in the meantime, please sign Curtis's petition: https://www.change.org/p/boris-johnson-boris-johnson-we-need-a-fairer-system-for-this-year-s-a-level-and-gcse-students
Correction: Curtis is from Ealing not Enfield. Apologies to Curtis, to Ealing, and to Enfield, for this mistake.