2024/05/02 05:03:33 · science

A study published by the American Psychological Association shows that people feel less morally outraged when sexism occurs due to an algorithm rather than direct human involvement.

In the study, the researchers coined the term "algorithmic anger deficit" to describe their findings across eight experiments involving more than 3,900 participants from the United States, Canada and Norway.

When presented with various scenarios of sexism in hiring decisions attributed either to algorithms or to human decision-makers, participants reported lower levels of moral outrage about the algorithm-driven decisions. Participants also believed that companies would bear less legal responsibility if the discrimination was caused by an algorithm.

Lead researcher Dr. Yochanan Bigman, a postdoctoral fellow at Yale University and an incoming assistant professor at the Hebrew University of Jerusalem, said: "This raises concerns that companies may use algorithms to protect themselves from accusations and public scrutiny of discriminatory practices." Bigman believes the findings could have broader implications for efforts to combat discrimination. The study was published online in the Journal of Experimental Psychology: General.

Bigman said: "People think the people who are discriminating are motivated by bias, such as racism or sexism, but they think the algorithms that are discriminating are motivated by data, so they are less morally outraged. Moral outrage is an important social mechanisms that can motivate people to address injustice. If people are less morally outraged by discrimination, they may be less motivated to take action.

Some experiments used a scenario based on Amazon, a real-life example of alleged algorithm-based sex discrimination that penalized female job applicants. While the study focused on sexism, one of the eight experiments examined racial and age discrimination and came to similar conclusions.

Participants' knowledge of artificial intelligence did not seem to make a difference. In an experiment with more than 150 tech workers in Norway, participants who reported greater knowledge of artificial intelligence were still less angry about sexism caused by algorithms.

The researchers found that learning more about a specific algorithm can affect people's views. In another experiment, participants were angrier when the hiring algorithm that resulted in sexism had been created by a male programmer at a company known for sexist behavior.

Bigman said programmers should be aware of the possibility of unintentional discrimination when designing new algorithms. Public education campaigns could also highlight that discrimination caused by algorithms can be the result of existing inequalities, he added.
