The problem with technology, many claim, is its quantitative bent: its "hard" math deployed in a softer human world. Tech is Mark Zuckerberg: a geek who got rich on numbers and raved about the social miracles of the metaverse, yet was so awkward in every human interaction that the discomfort was immediately obvious. The human world contains Zuck, but it is everything he fails at so well. This failure, and the lack of social and moral sensibility it suggests, is what many believe he shares with the industry he has come to represent.
Because Big Tech fails to understand humans, we often hear, it simply needs to hire more people who do. Pieces like "Liberal arts majors are the future of tech" and "Why computer science needs the humanities" have been a recurring feature of tech and business coverage over the past few years. It has been suggested that social workers and librarians could help the tech industry curb social media's harms to Black youth and the spread of false information, respectively. Many anthropologists, sociologists, and philosophers, especially those with advanced degrees who feel academia's economic pressure to favor STEM, are scrambling to demonstrate their utility to tech giants whose starting salaries would make the average humanities professor blush.
For the past few years, I have been researching these less technical workers in the tech and media industries. The argument for "bringing in" sociocultural experts ignores the fact that these roles and workers already exist in the tech industry, and in various forms always have. For example, many current UX researchers have advanced degrees in sociology, anthropology, or library and information science. Likewise, EDI (equity, diversity, and inclusion) specialists have long held positions in tech companies' HR departments.
Recently, however, the tech industry has begun to explore where non-technical expertise might address some of the societal issues associated with its products. A growing number of tech companies look to law and philosophy professors to help them navigate the legal and ethical intricacies of platform governance, to activists and critical academics to help protect marginalized users, and to other experts for assistance with platform challenges such as algorithmic oppression, disinformation, community management, user well-being, and digital activism and revolution. These data-driven industries are striving to augment their technical knowledge and vast quantities of data with social, cultural, and ethical expertise, or what I often call "soft" data.
But you can add all the soft data workers you want, and little will change unless the industry values that kind of data and expertise. In fact, many scholars, policy experts, and other sociocultural experts working in AI and tech ethics are noticing a disturbing trend: tech companies seek out their expertise, then disregard it in favor of more technical work and workers.
Such experiences are especially clear at this worrisome moment in the burgeoning field of AI ethics, where the tech industry may claim to include non-technical roles while in practice merely adding ethical and sociocultural framing to job titles that end up being held by the "same old" technologists. What's more, in our enthusiasm for these often undervalued "soft" professions, we cannot ignore their limitations in achieving the lofty goals set for them.
Although it is important to support the critical work done by these undervalued and under-resourced professions, there is no reason to believe their members are inherently better equipped to be moral arbiters. These people have very real and significant social and cultural expertise, but their fields are all contending with structural dilemmas and weaknesses of their own.
Take anthropology, for example, a discipline that emerged as an integral part of the Western colonial project. Although cultural anthropology now often espouses social justice aims, there is no guarantee that anthropologists (85 percent of whom, in the US, are white) will orient or deploy algorithms in a less biased way than computer scientists. Perhaps the most notorious example is PredPol, the multimillion-dollar predictive policing company that Ruha Benjamin calls part of the New Jim Code. PredPol was created by Jeff Brantingham, an anthropology professor at UCLA.
Similar tensions exist in other academic fields that push soft data. Sociology's role in the early surveillance and quantification of Black populations is implicated in the vast majority of surveillance technologies that monitor Black communities today. My own field of study, critical internet research, is very white and has often lost focus on race and racism. In fact, I am frequently one of the few Black and brown researchers in attendance at our field's conferences. At times, I am met with more diversity in tech industry settings than in the academic spaces from which the major critiques of Big Tech emerge.