New Report From Pew Research Looks at Public Attitudes Toward Computer Algorithms

The American public expresses broad concerns over the fairness and effectiveness of computer programs making important decisions in people’s lives, according to a new survey from Pew Research Center. Indeed, 58% of Americans say that computer programs will always reflect some level of human bias, compared with 40% who think these programs can be designed in a way that is bias-free.

The survey of nearly 4,600 U.S. adults presented respondents with four different real-world scenarios in which computers make decisions by collecting and analyzing large quantities of public and private data: a personal finance score used to offer consumers deals or discounts; a criminal risk assessment of people up for parole; an automated resume screening program for job applicants; and a computer-based analysis of job interviews.

Overall, the public expresses broad concerns about the fairness and acceptability of using computers for decision-making in situations with important real-world consequences. For example, only a minority of Americans think that the video job interview algorithm (33%) or the personal finance score algorithm (32%) would be fair to job applicants and consumers, respectively. And when asked directly whether they think the use of these algorithms is acceptable, a majority of the public says that it is not.

The survey also finds that attitudes toward algorithmic decision-making can depend on the context of those decisions and the characteristics of the people who might be affected. For instance, half (50%) of U.S. adults think that automated criminal risk scores would be fair to people who are up for parole, but just 32% think that an automated personal finance score would be fair to consumers.

And in the specific context of the algorithms that power most social media platforms, users’ comfort level with sharing their personal information also depends heavily on why that data is being used. Fully 75% of social media users say they would be comfortable sharing their data with those sites if it were used to recommend events they might like to attend. That share drops to just 37% if their data is being used to deliver messages from political campaigns.

These social media users also report being exposed to a mix of positive and negative content on these sites. Fully 71% of social media users say they have seen content there that makes them angry, with 25% saying they see this sort of content frequently. Meanwhile, 58% of users say they frequently encounter posts that are overly exaggerated, while 59% frequently encounter posts in which people are making accusations or starting arguments without waiting until they have all the facts.

This report is drawn from a survey conducted as part of the American Trends Panel, a nationally representative panel of randomly selected U.S. adults living in households recruited from landline and cellphone random-digit-dial surveys. The panel, which was created by Pew Research Center, is being managed by GfK. Data in this report are drawn from the panel wave conducted May 29-June 11, 2018, among 4,594 respondents. The margin of sampling error for the full sample of 4,594 respondents is plus or minus 2.4 percentage points.

Other Key Findings:

As is often true of users’ experiences on social media more broadly, negative encounters are accompanied by more positive interactions. One-quarter of users (25%) say they frequently encounter content that makes them feel angry, but a comparable share (21%) say they frequently encounter content that makes them feel connected to others. And 44% report that they frequently see content that amuses them.

Public attitudes toward algorithmic decision-making can vary by factors related to race and ethnicity. Just 25% of whites think the personal finance score concept would be fair to consumers, but that share rises to 45% among blacks. By the same token, 61% of blacks think the criminal risk score concept is not fair to people up for parole, but that share falls to 49% among whites.

Roughly three-quarters of the public (74%) thinks the content people post on social media is not reflective of how society more broadly feels about important issues. But 25% think that social media does paint an accurate portrait of society.

Social media users ages 18 to 29 are twice as likely to say they frequently see content on social media that makes them feel amused (54%) as to say they frequently see content that makes them feel angry (27%). But users ages 65 and older encounter these two types of content with more comparable frequency: 30% of older users frequently see content on social media that makes them feel amused, while 24% frequently see content that makes them feel angry.

Gary Price (gprice@mediasourceinc.com) is a librarian, writer, consultant, and frequent conference speaker based in the Washington D.C. metro area. Before launching INFOdocket, Price and Shirl Kennedy were the founders and senior editors at ResourceShelf and DocuTicker for 10 years. From 2006-2009 he was Director of Online Information Services at Ask.com, and is currently a contributing editor at Search Engine Land.