John Naughton wants us all to understand that algorithms are a terribly dangerous threat to society and democracy – that they are, in fact, the distillation of puppy dog tails.
Oddly, he seems to know little about the subject, writing instead on autopilot – even, perhaps, by algo.
Will Thursday 13 August 2020 be remembered as a pivotal moment in democracy’s relationship with digital technology? Because of the coronavirus outbreak, A-level and GCSE examinations had to be cancelled, leaving education authorities with a choice: give the kids the grades that had been predicted by their teachers, or use an algorithm. They went with the latter.
The outcome was that more than one-third of results in England (35.6%) were downgraded by one grade from the mark issued by teachers. This meant that a lot of pupils didn’t get the grades they needed to get to their university of choice. More ominously, the proportion of private-school students receiving A and A* was more than twice as high as the proportion of students at comprehensive schools, underscoring the gross inequality in the British education system.
Hmm, well, teachers are known to overestimate:
Universities and Colleges Admissions Service (UCAS) research comparing predicted grades sent as part of the university application process with final grades found grades were accurately predicted 42% of the time, with 48% of predictions being optimistic.
Research based on UCAS data looking at predictions of pupils' point scores based on their three best A-Level results – typically what university offers look like – found just one in six (16%) were spot-on.
However, three-quarters of scores were overestimated (75%), and just 9% were lower than the student eventually achieved.
The algo’s looking pretty good actually.
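The figures quoted above can be put side by side. A minimal sketch – the numbers are the published ones, the juxtaposition is mine, and this is not a model of the Ofqual algorithm itself:

```python
# Teacher predictions historically overshoot far more often than the
# 2020 algorithm downgraded. This simply restates the quoted figures.

teacher_predictions = {      # UCAS three-best-A-level research, quoted above
    "spot_on": 0.16,
    "overestimated": 0.75,
    "underestimated": 0.09,
}
algorithm_downgrade_rate = 0.356  # share of England results cut by one grade

print(f"Teacher over-predictions: {teacher_predictions['overestimated']:.0%}")
print(f"Algorithm downgrades:     {algorithm_downgrade_rate:.1%}")
```

On these numbers the algorithm trimmed roughly half as many results as the historical rate of teacher over-prediction would suggest it could have.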
The proportion of children educated in the independent sector winning the highest grades at A-levels is now at its lowest level since 2010, when the A* mark was introduced.
The proportion of private school students achieving A* and A has decreased by 6.3 percentage points over the past decade, from 52 per cent to 45.7 per cent, according to data released by the Independent Schools Council (ISC).
Meanwhile, the national proportion of students winning top grades over the same period has gone down by 1.5 percentage points, from 27 per cent in 2010 to 25.5 per cent this year.
Here it’s clearly wrong as it is underestimating the superiority of schools where teachers give a damn.
However, there’s a much more fun and deeper point here. Those screaming about algorithms tend to be those from the left side of the political aisle, which is odd. Because deciding things by algo is planning, and it is the left which insists that society can and should be planned. So, why the complaints? The entire wet dream is that we individuals become just the obedient components in the planned society, after all.
The answer is, I think, because there’s a certain unease at what the algos actually say. They do not – as with the exam results where the desire is that all should have prizes – come up with the desired answer. And that’s because the algos are not the product of planning themselves. Instead, they grow by machine learning. That is, they look at lots and lots and lots of what humans actually do. Then they make a decent guess at what humans are going to do subject to those same incentives or events.
Not, note, what humans should do, nor what it is desired that they do, but what we actually do do. That is, algos informed by machine learning are really the codification of markets. If you prefer, they are wired by the revealed preferences of actually existing humans, not the expressed, and most certainly not the desired, preferences of those who would rule us. Which neatly explains why it is those who would rule and plan us who seem to be getting upset.
Algorithms for decision making are reaching the wrong answers, d’ye see, by doing what people do. Given that the entire project is to change what people do – or, failing that change, to change the people doing the doing – we can see the frustration.
Algos are markets, not planning, that’s the complaint.