Study reveals that a crowd’s opinion is less taken into account than an algorithm’s

In a study published on April 13, three researchers found that we tend to trust machines and algorithms more than other human beings when making decisions. The study, funded to the tune of $300,000 by the U.S. Army, was led by Eric Bogert, a doctoral student in the Department of Information Technology at the University of Georgia, together with Aaron Schecter and Richard Watson, both professors in the same department. Three experiments were conducted to test this tendency.

A battery of psychological experiments

To carry out the study, 1,500 people recruited online were asked to count the number of people present in a series of photographs. With each new experiment, the number of individuals in the photographs increased. For each photograph, the subjects had to choose between the suggestion of a group of 5,000 people (the crowd) and that of an algorithm trained beforehand on a database of 5,000 images.
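As a rough illustration of the experiment's choice structure (this is not the authors' protocol or data; the preference model, its parameters, and all function names below are entirely hypothetical), one could simulate subjects picking between the two advice sources, with the chance of siding with the algorithm rising as photographs get more crowded:

```python
import random

def run_trial(n_people_in_photo, rng, base_pref=0.5, shift_per_person=0.005):
    """Hypothetical model: the probability of choosing the algorithm's
    suggestion grows with the number of people in the photograph."""
    p_algorithm = min(1.0, base_pref + shift_per_person * n_people_in_photo)
    return "algorithm" if rng.random() < p_algorithm else "crowd"

def reliance_rate(n_people_in_photo, n_subjects=1500, seed=0):
    """Fraction of simulated subjects who side with the algorithm."""
    rng = random.Random(seed)
    choices = [run_trial(n_people_in_photo, rng) for _ in range(n_subjects)]
    return choices.count("algorithm") / n_subjects

# Sparse vs. crowded photographs: reliance on the algorithm increases.
for n in (15, 50, 100):
    print(n, round(reliance_rate(n), 3))
```

Under these made-up parameters the simulated reliance rate climbs with crowd size, which is the qualitative pattern the study reports, not its actual numbers.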

This study is the first step in a longer line of research, as Aaron Schecter explains:

“The ultimate goal is to look at groups of humans and machines making decisions and find out how we can get them to trust each other and how that changes their behavior.”

Eric Bogert, the lead author of the publication detailing the study, explains the point of the three experiments:

“Algorithms are capable of performing a large number of tasks, and the number of tasks they are capable of performing is increasing almost daily. There seems to be a cognitive bias that shows reliance on the algorithms, as the tasks that need to be done on a daily basis become more difficult and this effect is stronger than the bias of relying on the advice of others.”

Surprising results

The results are clear: across the three experiments, the more people there were in a photograph, the more subjects trusted the algorithm's answer over the advice given by the crowd. This effect persisted even when the quality of the advice and the subjects' numeracy and accuracy were controlled for. One explanation Aaron Schecter offers is that subjects generally believe a task that depends on counting is better suited to a trained algorithm than to a human being.

Schecter goes on to discuss individuals' perceptions of algorithms and AI:

“One of the common problems with AI is when it is used to grant credit or approve a person for loans. While it’s a subjective decision, there are a lot of numbers in there – like income and credit score – so people feel like it’s a good job for an algorithm. But we know that dependency leads to discriminatory practices in many cases because of social factors that aren’t taken into account.”

However, another finding partially contradicts the initial results: subjects also tended to ignore inaccurate advice more strongly when it was labeled as algorithmic than when equally inaccurate advice was labeled as crowd-sourced.

Translated from Une étude dévoile que l’avis d’une foule est moins pris en compte que celui d’un algorithme