People really don’t like being judged by machines.
A wide majority of Americans say they worry algorithms will treat them unfairly when used to make judgments in crucial circumstances, such as during job interviews, parole proceedings, or financial transactions, according to a new survey released Friday by Pew Research Center. Respondents told researchers that algorithms violate privacy, can be unfair, and eliminate the human element necessary for crucial decisions.
The results come during a week when Facebook is reeling from repeated accusations that its news feed algorithm is still being abused by America's enemies.
Computer scientists have been ringing the alarm bell about bias in algorithms for some time, and it appears the general public shares their skepticism.
“Despite the growing presence of algorithms in many aspects of daily life…the public is frequently skeptical of these tools when used in various real-life situations,” the report said.
Results from the Pew survey suggest an across-the-board concern among Americans about math being used against them:
- 56% of Americans think use of a “criminal risk score” is unacceptable. When asked why, the top two reasons supplied were that every “individual/circumstance is different” and that “people can change.” But use of criminal risk scores had its supporters, too. One told researchers, “While such a program would have its flaws, the current alternative of letting people decide is far more flawed.”
- 57% say automated resume screening is “unacceptable,” and 67% say “computer-aided video job analysis” screening is unacceptable.
- 58% of Americans say that computer programs will always reflect some level of human bias, compared with 40% who think these programs can be designed in a way that is bias-free. Older Americans (ages 50 and up) are much more likely than younger Americans (ages 18 to 29) to say programs will always reflect bias (63% vs. 48%), but no age group showed great confidence in bias-free algorithms.
- Only about one-third say they think that the video job interview (33%) and personal finance score (32%) algorithms would be fair to job applicants and consumers.
In some contexts, certain groups express less distrust of algorithms. Just 25% of whites think the personal finance score concept would be fair to consumers, but that share rises to 45% among blacks. On the other hand, 61% of blacks think the criminal risk score concept is not fair to people up for parole, but that share falls to 49% among whites.
Pew offers these explanations for the wide distrust of algorithms:
- They violate privacy. This is the top concern of those who find the personal finance score unacceptable, mentioned by 26% of such respondents.
- They are unfair. Those worried about the personal finance score scenario, the video job interview vignette, and the automated screening of job applicants often cited concerns about the fairness of those processes.
- They remove the human element from important decisions. This is the top concern of those who find the automated resume screening concept unacceptable (36% mention this), and it is a prominent concern among those who are worried about the use of video job interview analysis (16%).
- Humans are complex, and these systems are incapable of capturing nuance. This theme recurs across several of these concepts and is especially prominent among those who find the use of criminal risk scores unacceptable. Roughly half of these respondents mention concerns that all individuals are different, or that such a system leaves no room for personal growth or development.