Anyone concerned about how inequity is perpetuated by seemingly ‘neutral’ or ‘scientific’ processes.
In a nutshell:
Data scientist O’Neil explores what she calls WMDs, or Weapons of Math Destruction – large-scale, largely opaque algorithms that control aspects of our lives, from college rankings and admissions to credit scores to voting. She argues that these systems are flawed, with biases built in that harm all of us.
“The human victims of WMDs, we’ll see time and again, are held to a far higher standard of evidence than the algorithms themselves.”
“A model’s blind spots reflect the judgments and priorities of its creators.”
Why I chose it:
Seemed appropriate given the recent A-level shitstorm we’ve lived through in the UK.
Every August in England, 17- and 18-year-olds find out their A-level scores. Unlike in the US, where basically unless you royally screw up in the final term of your senior year you are going to the university you were accepted to in March, in the UK students receive conditional offers. Let’s say you want to go study Chemistry. Well, at a top school, you might receive a conditional offer of AAA – meaning you need As in three of your A-levels (the best mark is an A*), and one of those will need to be Chemistry. Okay, so come mid-August, you go to your school and learn that you received … AAA! Hurrah! You confirm your place at university, and start the following month.
This year, because of the pandemic, A-level exams were scrapped. Instead, the government put together an algorithm that was meant to work out what grades students would have gotten had they sat their exams. It was based on a few things, like practice exams, coursework, etc. It also, apparently, took the past performance of the school a student attended into account.
Do you see where this is going?
On results day, tens of thousands of students received A-level results DRAMATICALLY lower than what they had been predicted to get. And the general theme was that those lower scores went to students in areas with poorer-performing schools. Students were essentially punished by the algorithm for doing too well, and had their places at university pulled out from under them, upending their entire futures. In the end, the algorithm was scrapped, students were put through horrible stress, and universities now have more students than they planned for, in the middle of a pandemic.
I share this story because I can see it making its way into this book during the next revision. O’Neil is a great writer, making a book that could have been dry and confusing extremely easy to read and engaging. It’s also infuriating.
She looks at things like credit scores being used to rule people out of jobs, at recidivism models used in sentencing in the criminal punishment system, and even the college rankings in US News and World Report. She also touches on how Facebook and Google create profiles using all the data they have, adjusting their targeting accordingly.
She refers to algorithms as ‘opinions formalized in code,’ and that’s especially frightening considering how many people view such algorithms as value-neutral and just ‘showing data.’ The negative impacts – generally borne by people who are poor, or aren’t white – are seen not as harms perpetuated by the models themselves, but as moral failings of the individuals judged by these flawed systems. It’s insidious.
It seems inescapable, but O’Neil does offer some suggestions at the end, and they don’t seem entirely out of the realm of possibility (one fix she points to is GDPR-style regulation, and GDPR actually did pass and is now law in the EU). But man, it’s yet another thing that our society needs to fix.
Keep it / Pass to a Friend / Donate it / Toss it: