Weapons of math destruction: How big data increases inequality and threatens democracy

A review of 'Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy' by Cathy O'Neil (Allen Lane, 2016, 272pp, £12.99)

Ken Pease

The age of the algorithm, with its bedfellows machine learning and artificial intelligence, is upon us. The road to the algorithm is routed as follows: first, a machine is let loose on massive sets of data that include the outcome to be predicted (the presence of a disease, academic success or the chance of defaulting on a mortgage); the 'trained' machine then 'knows' what factors, and in what combination, predict the outcome in question; this is then enshrined as a predictive formula, the algorithm. On 28 February, the House of Commons Science and Technology Committee launched an inquiry into the use of algorithms in decision making, noting: 'Algorithms are being used to make decisions in a growing range of contexts. From decisions about offering mortgages and credit cards to sifting job applications and sentencing criminals, the impact of algorithms is far reaching.' But what are the problems? The book reviewed here suggests the following:

1. Where the outcome variable is under human control, the algorithm reflects the prejudices of the decision maker. For instance, if reconviction is the outcome variable, attributes of those who are policed most closely will feature in the algorithm. If the police take against people with red hair, attributes associated with having red hair (for example being of Irish extraction) will be in the algorithm, as the sketch following this list illustrates.

2. Those who know what is in the algorithm will be able to 'game' the system by manipulating the key variables spuriously, that is, improving their scores without changing the underlying reality those variables are meant to measure.

3. Not knowing how the algorithm works will lead to palpable injustice.

O'Neil gives examples of each problem, and of the Catch-22 twist linking the second and third problems. She tells of an excellent teacher whose poor assessment by the 'algorithm gods' cost her time, distress and, ultimately, her job. It turned out that her classes had arrived with spuriously high attainment scores because of cheating by their previous teachers, so the students' progress while in her care was understated and ascribed to her incompetence. Yet exactly how the algorithm was calculated was withheld from her, on the basis that, were it transparent, she would be in a position to adjust it to her advantage. The algorithm can either be ineffable or manipulable. It cannot be neither!
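To make both the train-then-deploy pipeline described above and the first problem concrete, here is a minimal, entirely hypothetical simulation in Python using scikit-learn. The group labels echo the review's red-hair example; every rate, feature and name is an invented assumption, not data or code from the book. True offending is identical across groups, but because one group is policed more heavily, the recorded reconviction outcome, and therefore the fitted model, rewards a proxy for membership of that group:

```python
# Hypothetical simulation: the outcome label records who is caught, not who
# offends, so a proxy for the over-policed group enters the trained model.
# All group names, rates and features are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

red_hair = rng.random(n) < 0.2                 # the group the police "take against"
offends = rng.random(n) < 0.1                  # true behaviour: same rate for everyone
catch_prob = np.where(red_hair, 0.9, 0.2)      # but red-haired offenders are caught far more often
reconvicted = offends & (rng.random(n) < catch_prob)   # the recorded outcome variable

# Features available to the machine: a correlate of red hair (here, Irish
# extraction) and an irrelevant control variable.
irish = (rng.random(n) < np.where(red_hair, 0.7, 0.05)).astype(float)
noise = rng.normal(size=n)
X = np.column_stack([irish, noise])

# "Training": learn which factors, in what combination, predict the outcome.
# The fitted coefficients are then enshrined as the predictive formula.
model = LogisticRegression().fit(X, reconvicted)
print("coefficient on 'Irish extraction':", round(model.coef_[0][0], 2))  # clearly positive
print("coefficient on the noise feature: ", round(model.coef_[0][1], 2))  # near zero
```

Nothing in this sketch encodes prejudice explicitly; the bias enters entirely through the outcome variable, which is exactly the mechanism the first problem describes.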

O'Neil's book is dedicated to 'all the underdogs' because she contends that algorithm use tends to disadvantage the underdog. Her full argument for that view is well worth considering.