Title: Weapons of Math Destruction
Author: Cathy O’Neil
Category: Non-fiction
Rating: 3/5
10-word summary: Algorithms increase inequality, but they can be used for good.
About Weapons of Math Destruction
Weapons of Math Destruction is a book focused on the ways mathematical models (algorithms) can increase inequality in the US. O’Neil provides some scary examples of how much damage algorithms can do.
According to her, algorithms are often deployed with good intentions, but their effects can still be harmful. In some cases, they create negative feedback loops that punish the poor and keep them from escaping poverty. Such algorithms touch many parts of their lives: the for-profit colleges that target them with manipulative ads to get them to enroll, the number of police officers patrolling their neighborhoods, their chances of getting hired, and more.
Sadly, we often cannot see how much harm algorithms cause, and we rarely have a chance of understanding how they work. The businesses that use them either cannot see the destruction they are causing or are unwilling to stop because the algorithms are profitable. Still, something needs to change, and the very same algorithms that do damage can be used as a force for good.
Lessons from Weapons of Math Destruction
Algorithms become weapons of math destruction when they are opaque, operate at a large scale and do a lot of damage
Not all algorithms are bad, obviously. But any algorithm can become a WMD when it operates in a way users do not understand, when it can affect millions of people and when it can have a negative impact on their lives.
For example, some for-profit colleges invest heavily in advertising that targets poor people. They lure them in with the promise that a good education (at their college) will pay off later in the form of good jobs. But these colleges charge high tuition fees, which only push poor students into debt. And the promised good job is never a certainty.
This system is opaque because the people who are targeted do not know they were chosen because of their poverty. And it damages people’s lives because it makes them accumulate debt that only hurts their future.
Algorithms often cannot measure the things that matter most
In many cases, companies and institutions use algorithms to calculate probabilities and make decisions. But some things cannot be measured easily: how productive a future employee will be, how likely someone is to pay back a loan, or how well teachers are doing their job. So the people designing the algorithms rely on proxies – stand-in data that may be a good indicator, or may not.
When algorithms use proxies, they can end up punishing people for the circumstances they live in instead of judging their actual actions (which is what the algorithms are supposed to measure).
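To make the proxy problem concrete, here is a minimal, hypothetical sketch (my own illustration, not taken from the book). A toy credit-scoring function mixes real behaviour (on-time payments) with a proxy for circumstances (a made-up default rate by ZIP code), so two applicants with identical payment histories end up with different scores simply because of where they live.

```python
# Hypothetical sketch of proxy-based scoring (my illustration, not from the book).
# The model is meant to predict repayment behaviour, but it leans on ZIP code
# as a proxy for circumstances, so identical applicants from different areas
# receive different scores.

# Made-up "average default rate by ZIP code" lookup table.
ZIP_DEFAULT_RATE = {
    "10001": 0.04,  # wealthier area (invented number)
    "60621": 0.22,  # poorer area (invented number)
}

def credit_score(on_time_payment_rate: float, zip_code: str) -> float:
    """Toy score, higher is better: mixes real behaviour with a proxy."""
    behaviour = on_time_payment_rate          # what we actually want to measure
    proxy = 1.0 - ZIP_DEFAULT_RATE[zip_code]  # stand-in for circumstances
    return 100 * (0.5 * behaviour + 0.5 * proxy)

# Two applicants with identical payment histories:
print(credit_score(0.95, "10001"))  # ~95.5
print(credit_score(0.95, "60621"))  # ~86.5 -- penalized for where they live
```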
The algorithms are not bad per se. Often, the main problem is the objective people set
For example, a company that lends money and charges a commission can use ads to target poor people who need money. It can single out the people most likely to be desperate for cash fast and charge them the highest commissions. In this case, the algorithm is not the problem; the problem is the business owners' objective of making money even if that means exploiting desperate people.
The same algorithms can be used either to exploit and punish people or to help them
Now imagine you are a wealthy person who wants to do good. You can use the same algorithms to target the same desperate people and offer them loans with no commissions. Or you can help them find stable jobs so they have a chance to escape poverty. It all depends on your goals.
The technology is neither good nor bad. It can be used to punish people and ruin their lives, or it can be used to help people and give them the opportunity to live decent lives. This was a very interesting and uplifting idea, and I was happy to imagine the applications that could do good.
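As a rough illustration of that point (my own hypothetical sketch, with invented names and numbers, not code from the book): the same toy model that ranks people by financial distress can feed either a predatory action or a helpful one. The model does not change; only the objective attached to it does.

```python
# Hypothetical sketch (my illustration, not from the book): the same
# "who is in financial distress?" model can serve opposite objectives.
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    missed_payments: int        # invented feature
    payday_loan_searches: int   # invented feature

def distress_score(p: Person) -> float:
    """Toy model estimating how desperate someone is for cash."""
    return 2.0 * p.missed_payments + 1.0 * p.payday_loan_searches

def exploit(p: Person) -> str:
    return f"Show {p.name} an ad for a high-commission payday loan."

def assist(p: Person) -> str:
    return f"Offer {p.name} a no-commission loan or job-placement support."

people = [
    Person("A", missed_payments=3, payday_loan_searches=5),
    Person("B", missed_payments=0, payday_loan_searches=1),
]

# The ranking is identical in both cases; only the action attached differs.
for p in sorted(people, key=distress_score, reverse=True):
    print(exploit(p))  # one business owner's objective
    print(assist(p))   # another's
```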
What I like about Weapons of Math Destruction
1. It proves that algorithms are dangerous because they are opaque, unfair, damaging and can affect a lot of people
Algorithms are becoming more and more pervasive. Sadly, most people are unaware of how often algorithms collect and process their data. We cannot see when an algorithm, or even a WMD, is affecting us, because that is how they are designed. But they are dangerous and can harm our lives without our ever finding out.
So I think it’s up to us to educate ourselves on how algorithms work and to understand how much they can affect us all. This book helps us understand a part of the problems they can create.
2. It includes many compelling cases that prove how much harm WMDs can do
O’Neil included some cases that truly made me worry about how much damage algorithms can do. These true stories show that algorithms can affect people in many ways: they can get good teachers fired, get poor people arrested for low-level crimes, make them pay more for loans or insurance, and so on. All of these cases made me more concerned about the negative effects of algorithms and Big Data.
3. The book was well-researched
I can tell that O’Neil spent a lot of time researching this book. She covered many real-life cases and personal stories, and these stories were informative and eye-opening. She also explained how these injustices probably happened and what effects they had, or could have, on people’s lives.
4. It includes some great reflections on WMDs and the way they affect US society
I really like the passages where O’Neil reflects on WMDs and everything they entail. For example, she talks about the difficulty of making an algorithm fair. Fairness and justice are complex concepts, and therefore very hard (or impossible) to express in code, so many algorithms simply don’t take fairness into account. She also adds that business owners sometimes wouldn’t even want fairness built into an algorithm because it would decrease their profits. She argues that we may need to change the way we measure success if we want a society that is fair and still uses algorithms. Chasing financial gain at all costs will not increase fairness or equality.
What I don’t like about Weapons of Math Destruction
1. The most interesting parts of the book were the Introduction and the Conclusion
I really enjoyed the Introduction and the Conclusion of the book. These were the parts that explained how algorithms lead to inequality and what would need to change to disarm WMDs. These are the parts of the book where O’Neil also expressed most of her thoughts on the matter of WMDs.
However, I feel like I learned very little from the 10 chapters in between. I would even say I would have been happy with a 50-page book consisting of the Introduction and the Conclusion alone, since those parts were so interesting and eye-opening.
2. I did not learn much from it
I think the book did have some great information and opinions in it. O’Neil focused on some problems created by algorithms that I have not read about in other books about technology. She also included her own judgment of these problems and I loved her perspective. However, I feel like I learned little from this book.
3. The book is very repetitive and a bit predictable
After reading the first 2-3 chapters, I realized that every chapter focuses on one area affected by WMDs. Each of the 10 chapters shows how an algorithm can have devastating effects on some people. These examples were very good and eye-opening, but quite predictable: in every situation, the causes and effects were similar and easy to anticipate. Even though the chapters cover different fields, it was mostly the same scenario, so honestly, I wasn’t very eager to keep reading one similar story after another.
4. Most of the book consists only of stories
I know that stories make a book much easier to read. But too many stories leave little room for actual information, facts and valuable ideas. I felt that this book focused on the stories too much. This is probably why I felt that I was learning little and the book could have been shorter.
5. This book was written about and for the people of the United States
I find that this is a pretty common phenomenon. US authors write books about interesting topics and, since they live in the US, they write for and about US society.
However, their books are often bought by people from all around the world. In many cases, they are even translated into several languages. So I find it a bit upsetting when authors do not seem to make an effort to consider and write about the rest of the world too. And this becomes even more necessary when the author talks about a topic that can affect everyone – such as algorithms.
O’Neil wrote her book as if it were meant to be read only by people in the US. She talks about baseball, Wall Street investors, college applications and more. To be honest, I felt a bit alienated while reading it. After all, the WMDs created in the US may affect us all – at least some of them can. Even if people outside the US do not use those particular websites, the same technology can be exported, or it may inspire business owners all around the world to create similar algorithms. So I think this is a global problem, yet it was treated as one that exists only within US borders.
6. It’s difficult to assess how harmful WMDs actually are
The author states, at one point, that WMDs or algorithms are harmless for most people. However, they do end up having a negative or devastating effect on some people.
Here’s the tricky part – after reading this book, I still don’t know how harmful WMDs are. I do not know how many people they truly harm. Is it 1%? Is it 25%? How many of the people who use the internet to look for a college, a job or an insurance are likely to be influenced so that a business takes advantage of them?
Honestly, I have no idea, and I think this information would have been very important. I would have loved it if O’Neil had given us an estimate of the number of people who become victims of WMDs. I understand that this may be very hard to find out, since many companies use algorithms that are black boxes: we cannot know how they work, so we probably cannot estimate how many people they affect. Even so, WMDs are too powerful not to at least try to find out how many people they can reach.
7. The book offers no advice on how to protect ourselves against WMDs
After reading this book, there’s a good chance you will feel worried about or even scared of WMDs. You will wonder how many companies use such algorithms, and whether you have been, or will be, one of the people who suffer at their hands. So you will ask, “What should I do?” Sadly, this book gives no guidance. Even though O’Neil is aware of the danger of WMDs, she educates you about that danger and then leaves you to suffer at the hands of those WMDs – if you are unlucky.
The solutions she does consider in the book – there is a short section about this – are better legislation and a different way of measuring success. While I understand that these may be the most effective long-term solutions, such changes take a lot of time and effort. In the meantime, many people are defenseless in the face of WMDs. I would have liked some practical advice on what we can do to protect ourselves, or at least try to. A few practices that could keep us safer would have been very helpful!
Quotes from Weapons of Math Destruction
“Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that’s something only humans can provide.”
“Our own values and desires influence our choices, from the data we choose to collect to the questions we ask. Models are opinions embedded in mathematics.”
“These are the three elements of a WMD: Opacity, Scale, and Damage.”
“While looking at WMDs, we’re often faced with a choice between fairness and efficacy. But fairness is squishy and hard to quantify. It is a concept. And computers, for all their advances in language and logic, still struggle mightily with concepts.”
“Opaque and invisible models are the rule, and clear ones very much the exception. We’re modeled as shoppers and couch potatoes, as patients and loan applicants, and very little of this do we see – even in applications we happily sign up for. Even when such models behave themselves, opacity can lead to a feeling of unfairness.”
“We are ranked, categorized, and scored in hundreds of models, on the basis of our revealed preferences and patterns. This establishes a powerful basis for legitimate ad campaigns, but it also fuels their predatory cousins: ads that pinpoint people in great need and sell them false or overpriced promises.”
“Facebook is more like the Wizard of Oz: we do not see the human beings involved. When we visit the site, we scroll through updates from our friends. The machine appears to be only a neutral go-between. Many people still believe it is.”
“As is often the case with WMDs, the very same models that inflict damage could be used to humanity’s benefit. Instead of targeting people to manipulate them, it could line them up for help.”
“Math deserves much better than WMDs, and democracy does too.”
Should You Read Weapons of Math Destruction?
Maybe. If you want to learn more about some of the ways WMDs increase inequality and damage people’s lives, read this book. If you like books that focus on stories and examples, read this book.
However, if you want to learn how algorithms work in general, this book may not be the best choice for you. If you are looking for a book packed with information about data science and algorithms, you can probably find a better one.
But as I said, Weapons of Math Destruction is still an interesting book, so it may be worth your while.
Have you read this book? If you have, what did you think about it?