Weapons of Math Destruction
Rating: 4.1 (165)
Cathy O'Neil
NEW YORK TIMES BESTSELLER • A former Wall Street quant sounds the alarm on Big Data and the mathematical models that threaten to rip apart our social fabric—with a new afterword

“A manual for the twenty-first-century citizen . . . relevant and urgent.”—Financial Times

NATIONAL BOOK AWARD LONGLIST • NAMED ONE OF THE BEST BOOKS OF THE YEAR BY The New York Times Book Review • The Boston Globe • Wired • Fortune • Kirkus Reviews • The Guardian • Nature • On Point

We live in the age of the algorithm. Increasingly, the decisions that affect our lives—where we go to school, whether we can get a job or a loan, how much we pay for health insurance—are being made not by humans, but by machines. In theory, this should lead to greater fairness: Everyone is judged according to the same rules. But as mathematician and data scientist Cathy O’Neil reveals, the mathematical models being used today are unregulated and uncontestable, even when they’re wrong. Most troubling, they reinforce discrimination—propping up the lucky, punishing the downtrodden, and undermining our democracy in the process. Welcome to the dark side of Big Data.
More Details:
Author
Cathy O'Neil
Pages
288
Publisher
Crown
Published Date
2016-09-06
ISBN
0553418823 (ISBN-10) · 9780553418828 (ISBN-13)
Ratings
Google: 3.5
Community Reviews
"Every data analytics/data science/software engineering student or professional should read this and heed its warning #theAlgorithm #socialJustice #privacy_and_surveillance #policing #recidivism #advertising "
A W
"#statistics #misuse #behindthescenes #transparency #prejudice #decisiontheory "
Jacqueline Burris
"This book changed the way I look at big data.

US News college rankings began as a one-time report to get a bump in the magazine's circulation for the week. But they were adopted broadly and became a de facto standard. Widespread faith in their model as a proxy for the actual quality of the colleges is problematic because schools will focus on improving the metrics rather than on changing things that more directly affect student success. For example, one of the metrics is the number of applicants, which incentivizes schools to hike tuition in order to build (potentially unnecessary) new buildings to attract more applicants. And safety schools are no longer safety schools: another US News metric is the ratio of students who attend to those who were accepted, so former "safety schools" will now reject overqualified applicants to boost that ratio.

O'Neil defines "weapons of math destruction" (WMDs) as systems that are (1) algorithmic, (2) opaque, and (3) widespread.
1. Because they're computer models running on their own, people may assume they're unbiased, even though they reflect the choices their creators made in building them. (People trust the US News ranking because its results come from an algorithm.)
2. It's not possible to see inside the systems to question their results. This opacity is due partly to the assumption that they're unbiased, and partly to attempts to keep people from gaming the models by pandering to their input parameters. (If a college is losing rank in the US News rankings, it has no clear way of knowing why.)
3. Because they operate at a wide scale, the systems have a large impact on society as a whole. (The US News college ranking would hardly be problematic if most people ignored it; it becomes a WMD when both the students applying to colleges and the colleges trying to attract those students rely on the model.)

O'Neil examines many WMDs, showing how they perpetuate systems of inequality:

• Police use modeling software to decide where to patrol, hoping to enforce more effectively with limited resources. The software suggests areas based on past crimes. The police go to those areas, where they find and record petty crimes, which are fed back into the model, which tells them to patrol the same area again.
• Criminal recidivism algorithms are built to inform sentencing by determining which people are most likely to commit crimes again. But they use people's location and acquaintances, giving harsher sentences to people from poor neighborhoods, which makes it harder for them to get jobs when they get out of prison, which makes their neighborhoods even poorer.
• Résumé-screening software is meant to save time by narrowing down the list of candidates, but it may be trained on past résumés, carrying forward the biases of the humans who reviewed them. Nearly half of employers screen based on credit scores, which makes it harder for people who have struggled financially to secure a job, making them more likely to struggle financially in the future, further worsening their credit scores and making it harder still to find a job.
• "E-score" proxies for credit scores (used in contexts like marketing, where credit scores cannot legally be used) may conflate people with the same name, and there's no good way for people to set their records straight. People may be denied apartment leases because of crimes committed by someone else with the same name; they get no justification and have no way to appeal.
• Drivers' insurance uses these credit scores as a proxy for safe driving rather than using actual driving records. As a result, the cost of insurance is affected more by ZIP code than by having a DUI on your record. (ZIP code is a proxy for income and for race.) People in underprivileged ZIP codes are overcharged, while the wealthier, who have the time to negotiate and find alternative insurance, are offered far better rates. (This is yet one more expense for the poor, making it likelier that they'll struggle to pay their bills on time, which will worsen their credit scores and make their insurance bills even higher.)
• Scheduling software is built to employ people as cheaply as possible: it limits workers to fewer than thirty hours per week to avoid paying for their health insurance; it increases efficiency by having the same people close at night and reopen the next morning; and it assigns irregular hours, which makes it difficult to hold a second part-time job.
• For-profit colleges issue diplomas that are worth only as much as high school diplomas but cost more than flagship state institutions. They rely on vulnerable, misinformed, low-income applicants taking out federal loans they're unable to repay. Targeted advertising lets these colleges select people at vulnerable stages of life, such as after a death in the family, a divorce, or a release from the army or prison.
• Politicians deliver different messages to different people based on their profile data. A small fraction of voters in a few swing counties in a few swing states decides the outcome of national elections. Wealthy donors can pay for advertising aimed at those few swing voters while most other voters are ignored; the data profiles of those voters provide the information needed to change the outcome of elections.

In the conclusion, O'Neil proposes some solutions that seem idealistic and impractical. The book gives a good overview of many major problems and shows how they're interconnected, but the suggestions for fixing those problems felt unsubstantiated and rushed. I don't know if the author can be blamed for that; there are no easy solutions.

For someone who reads The New York Times more than I do, there's probably not a lot new in this book. O'Neil does a good job of synthesizing information, but a lot of that information comes from NYT articles."
aqword
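The predictive-policing feedback loop the review describes can be sketched as a toy simulation. Everything below (the two districts, the rates, the winner-take-all patrol rule) is illustrative and not from the book; it only shows how allocating patrols by recorded crime can turn a one-incident data imbalance into a large one, even when the underlying crime rates are identical.

```python
# Toy model: two districts with the SAME underlying petty-crime rate.
# Patrols are sent where recorded crime is highest, and crime is only
# recorded where patrols go — so a tiny historical imbalance in the
# data compounds. All numbers are illustrative, not from the book.

TRUE_CRIME_RATE = 0.3             # identical in both districts
PATROLS_PER_DAY = 20
recorded = {"A": 10.0, "B": 9.0}  # district A starts with one extra recorded incident

for day in range(365):
    # The "model" sends every patrol to the district with the most recorded crime.
    target = max(recorded, key=recorded.get)
    # Officers find petty crime at the same rate they would find it anywhere,
    # but only the patrolled district's count grows.
    recorded[target] += PATROLS_PER_DAY * TRUE_CRIME_RATE

print(recorded)  # → {'A': 2200.0, 'B': 9.0}
```

After a simulated year the model "confirms" that district A is the high-crime district, although both districts generated crime at exactly the same rate; the only real difference was one extra recorded incident at the start.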