
If Only John Naughton Understood Anything



John Naughton wants us all to understand that algorithms are a terribly dangerous threat to society and democracy and are, in fact, the distillation of puppy dog tails.

Oddly, he seems to know little about the subject, writing instead on autopilot – even, perhaps, by algo.

Will Thursday 13 August 2020 be remembered as a pivotal moment in democracy’s relationship with digital technology? Because of the coronavirus outbreak, A-level and GCSE examinations had to be cancelled, leaving education authorities with a choice: give the kids the grades that had been predicted by their teachers, or use an algorithm. They went with the latter.

The outcome was that more than one-third of results in England (35.6%) were downgraded by one grade from the mark issued by teachers. This meant that a lot of pupils didn’t get the grades they needed to get to their university of choice. More ominously, the proportion of private-school students receiving A and A* was more than twice as high as the proportion of students at comprehensive schools, underscoring the gross inequality in the British education system.

Hmm, well, teachers are known to overestimate:

Universities and Colleges Admissions Service (UCAS) research comparing predicted grades sent as part of the university application process with final grades found grades were accurately predicted 42% of the time, with 48% of predictions being optimistic.


Research based on UCAS data looking at predictions for pupils’ point scores based on their three best A-level results – typically what university offers look like – found just one in six (16%) were spot-on.

However, three-quarters of scores were overestimated (75%), and just 9% were lower than the student eventually achieved.

The algo’s looking pretty good actually.

The proportion of children educated in the independent sector winning the highest grades at A-levels is now at its lowest level since 2010, when the A* mark was introduced.
The proportion of private school students achieving A* and A has decreased by 6.3 percentage points over the past decade, from 52 per cent to 45.7 per cent, according to data released by the Independent Schools Council (ISC).
Meanwhile, the national proportion of students winning top grades over the same period has gone down by 1.5 percentage points, from 27 per cent in 2010 to 25.5 per cent this year.

Here it’s clearly wrong as it is underestimating the superiority of schools where teachers give a damn.

However, there’s a much more fun and deeper point here. Those screaming about algorithms tend to be those from the left side of the political aisle, which is odd. Because deciding things by algo is planning, and it is the left which insists that society can and should be planned. So, why the complaints? The entire wet dream is that we individuals become just the obedient components in the planned society after all.

The answer is, I think, because there’s a certain unease at what the algos actually say. They do not – as with the exam results where the desire is that all should have prizes – come up with the desired answer. And that’s because the algos are not the product of planning themselves. Instead, they grow by machine learning. That is, they look at lots and lots and lots of what humans actually do. Then they make a decent guess at what humans are going to do subject to those same incentives or events.
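That growing-from-data point can be made concrete. The sketch below is purely illustrative – the data, the school names and the mean-gap method are all invented for the example, not taken from Ofqual’s actual model – but it shows the shape of the thing: the “algorithm” contains no opinion about what grades pupils should get, only a summary of what comparable pupils did get.

```python
# Toy sketch: "learn" how optimistic each school's teacher predictions
# have historically been, then adjust this year's predictions by that gap.
# All numbers and names are invented; grades are on a 0-100 scale.

from statistics import mean

# Past years: (teacher_prediction, actual_result) pairs per school.
history = {
    "School A": [(70, 62), (80, 71), (65, 60)],
    "School B": [(75, 74), (85, 86), (60, 58)],
}

def historical_bias(school):
    """Average gap between what teachers predicted and what pupils achieved."""
    return mean(pred - actual for pred, actual in history[school])

def adjusted_grade(school, teacher_prediction):
    """Shift this year's prediction by the school's past optimism."""
    return teacher_prediction - historical_bias(school)

# School A's teachers ran optimistic in the past, so its predictions
# get marked down; School B's barely moved.
print(adjusted_grade("School A", 75))
print(adjusted_grade("School B", 75))
```

Note what the code never consults: fairness, desert, or what anyone wishes were true. It is a codification of revealed behaviour, which is exactly the point being made.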

Not, note, what humans should do, nor what it is desired that they do, but what we actually do do. That is, algos informed by machine learning are really the codification of markets. If you prefer, they are wired by the revealed preferences of actually existing humans, not the expressed preferences, and most certainly not the preferences of those who would rule us. Which neatly explains why it is those who would rule and plan us who seem to be getting upset.

Algorithms for decision making are reaching the wrong answers, d’ye see, by doing what people do. Given that the entire project is to change what people do – or, failing that change, to change the people doing the doing – we can see the frustration.

Algos are markets, not planning, that’s the complaint.



  1. An algorithm is simply: take this information, apply this method, this is the result. That’s why they want algorithms banned, they want to ban people applying methodical processes to the real world. 2 + 2 is 5 because we say so, stop thinking back, do as you’re bloody told, do as ***WE*** tell you.

  2. The algorithms probably have a better idea as to who would gain the right A levels than the teachers, but the only fair way to do it for individual students was to keep the schools open and hold examinations.
    If that was judged too dangerous then closing the schools and abandoning examinations should have been accompanied by suspending university education for new entrants for next year.
    Stopping production in nine tenths of a production line necessitates stopping the other tenth.
    It’s high time someone stopped giving the education establishment everything it wants.

  3. Was the algorithm derived from five or ten years previous predicted vs tested results by school? Should be simple enough to determine the trends and apply them to the current students. Of course there will be some anomalies but those exist anyway where a few students wildly over- or under-perform on performance or testing.

  4. Algorithms are generally only as good as the people who code them. Sure, you have self-learning algorithms, but they can have biases introduced (consciously or otherwise – see Google and its photo recognition algos classifying black people as gorillas) from the training data. So in the end it comes back to people to see if the predictions line up with the expectations.

  5. MrYan,

    Obviously dodgy as black people aren’t as nice as gorillas – but then you can’t buy drugs from a gorilla. On the other hand, gorillas don’t run amok stabbing people, and generally they grow up in family units where the father sticks around and takes care of them.

    You can’t trust those algorithms.

    PS. Gorillas can really play the drums after eating Cadbury’s chocolate – perhaps the bar that Mr Thompson didn’t throw back.
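The definition in the first comment – take this information, apply this method, this is the result – fits in a few lines. A minimal sketch, with invented marks and an invented method (the mean) standing in for whatever methodical process one applies:

```python
# Comment 1's definition made literal: information in, method applied,
# result out. The data and the choice of method are invented examples.

def algorithm(information, method):
    """Take this information, apply this method, this is the result."""
    return method(information)

marks = [62, 71, 58, 85]
mean_mark = algorithm(marks, lambda xs: sum(xs) / len(xs))
print(mean_mark)
```

Banning “algorithms” in this sense would mean banning the application of any stated method to any stated facts, which is rather the commenter’s point.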

