“Algorithmic Reparations” Initiative Calls for Racial Justice in Artificial Intelligence


Proponents of algorithmic reparations suggest learning from curatorial professionals such as librarians, who must consider how to ethically collect data about people and decide what a library's collection should include. They suggest considering not only whether an AI model's performance is fair or good, but also whether it shifts power.

These suggestions echo earlier ones from former Google AI researcher Timnit Gebru, who in a 2019 paper encouraged machine learning practitioners to consider how archivists and library scientists have dealt with issues of ethics, inclusiveness, and power. Gebru says Google fired her at the end of 2020; she recently launched the Distributed AI Research (DAIR) institute. A critical analysis concluded that Google subjected Gebru to a historical pattern of abuse experienced by Black women in professional settings. The authors of that analysis also urge computer scientists to look for historical and social patterns beyond their data.

Earlier this year, five U.S. senators urged Google to hire independent auditors to evaluate the impact of racism on Google's products and workplace. Google did not respond to the letter.

In 2019, four Google AI researchers argued that the field of responsible artificial intelligence needs critical race theory, because most work in the field does not account for the socially constructed nature of race or recognize the influence of history on the data sets being collected.

“We emphasize that data collection and annotation efforts must be grounded in the social and historical contexts of racial classification and racial category formation,” the paper reads. “To oversimplify is to do violence, or even more, to reinscribe violence on communities that already experience structural violence.”

The paper's lead author, Alex Hanna, was one of the first sociologists hired by Google. After Gebru's departure, she bluntly criticized Google executives. Hanna said she appreciates that critical race theory centers race in conversations about what is fair or ethical and can help reveal historical patterns of oppression. Hanna has since co-authored a paper, also published in Big Data & Society, on how face recognition technology reinforces constructs of gender and race that date back to colonialism.

At the end of 2020, Margaret Mitchell, who led Google's Ethical AI team alongside Gebru, said the company had begun using critical race theory to help decide what is fair or ethical. Mitchell was fired in February. A Google spokesperson said critical race theory is part of the company's process for reviewing AI research.

Another paper, written by White House science and technology policy adviser Rashida Richardson and due to be published next year, argues that artificial intelligence in the United States cannot be understood without acknowledging the effects of racial segregation. The legacy of laws and social norms that controlled, excluded, and otherwise oppressed Black people is too influential.

For example, research has found that algorithms used to screen apartment tenants and mortgage applicants disproportionately disadvantage Black people. Richardson said it must be remembered that, before the civil rights laws of the 1960s, federal housing policy explicitly required segregation. The government also colluded with developers and homeowners to deny opportunities to people of color and to keep racial groups apart. She said segregation enabled whites in homeowners' associations, school boards, and labor unions to engage in “cartel-like behavior.” In turn, segregated housing practices compound problems or privileges tied to education and generational wealth.

Richardson said the historical pattern of segregation has poisoned the data on which many algorithms are built, for example, data used to classify what counts as a “good” school or to police Brown and Black communities.

“Racial segregation has played a central evolutionary role in the replication and amplification of racial stratification in data-driven technologies and applications. Racial segregation also constrains the conceptualization of algorithmic bias problems and relevant interventions,” she wrote. “When the impact of racial segregation is ignored, issues of racial inequality appear as naturally occurring phenomena, rather than byproducts of specific policies, practices, social norms, and behaviors.”
