Week 3 Reader Blog

Chloe Xiang, “Scientists Increasingly Can’t Explain How AI Works”

The article makes an interesting argument: most AI systems are black-box models, meaning they are “viewed only in terms of their inputs and outputs”. The problem is that “AI systems notoriously have issues because the data they are trained on are often inherently biased, mimicking the racial and gender biases that exist within our society”. In other words, because the data used to train AI systems are biased, the results AI generates reproduce the racial and gender biases inherent in society. Therefore, when we receive AI results, we should always question their credibility, and AI developers and researchers should make an effort to erase racial and gender biases from these systems.

As Xiang writes in the article, “When we put our trust in a system simply because it gives us answers that fit what we are looking for, we fail to ask key questions: Are these responses reliable, or do they just tell us what we want to hear? Whom do the results ultimately benefit? And who is responsible if it causes harm?” These critical questions highlight the new challenges humans face in the age of AI. AI is a product built by a small group of researchers and scientists, most of whom are straight white men, so biases can enter the system from the start. We should then consider who benefits from the results and how. All of these questions are critical to improving the credibility and fairness of AI.

AI Decolonial Manyfesto

Similar to the first reading, this article challenges the prevailing Western-centric perspectives in the development and application of AI. It proposes to erase the systemic biases in AI and emphasizes the importance of considering diverse racial, cultural, and social contexts in the AI discourse. As the manifesto states, “‘Artificial’ and ‘intelligence’ are loaded terms, their definitions subject to cultural biases. AI is a technology, a science, a business, a knowledge system, a set of narratives, of relationships, an imaginary”. It is clear that AI, “as much of the technology has, is dominated by Western male voices, whiteness, and wealth”. Therefore, the article calls for a more ethical, inclusive, and diverse environment in AI.

The ideas in “AI Decolonial Manyfesto” align with Xiang’s article: both criticize current biased AI systems and urge change. The manifesto also introduces the idea of “decolonization” and points out the hegemonic narratives in AI. When thinking about AI and its function, it inspires us to consider the hidden hegemony and power dynamics within AI.

Annette Michelson, “Bodies in Space: Film as ‘Carnal Knowledge’”

Different from the other two readings, Annette Michelson’s 1969 essay “Bodies in Space: Film as ‘Carnal Knowledge’” is an in-depth study of Kubrick’s film “2001: A Space Odyssey.” The essay explores why and how the film became a significant work in film history and how it challenges traditional views of cinema through its “revolutionary use of space, movement, and corporeality”. Michelson argues that the viewer’s bodily experience and perception are central to understanding and engaging with the film.

Michelson offers many intriguing theories about film and its philosophy; one of my favorites is: “Cinema is the temporal instrument working in a direction counter to that of modernist painting’s increasingly shallow space, through which the deep space of illusionism is reinvented. In assuming the burden of illusionism, cinema reintroduces not only “lived reality,” but an entirely new and seemingly limitless range of structural relationships allowing for the reconciliation of ‘lived reality’ with ‘artistic form’”. This passage perfectly captures what Kubrick achieved in “2001: A Space Odyssey” – cinema uses time to reinvent the depth of illusion, merging ‘lived reality’ with ‘artistic form’ through new structural relationships.

