(Un)Folding Algorithmic Fairness. Gender as Absent Presence in Credit Scoring

The project focuses on gender as one important aspect of AI fairness, studying how gender is enacted in multiple ways through computational practices. Amidst the complexity of translating a social concept like fairness into computational procedures, the attempt to implement fairness in algorithms both reflects and contributes to the public demand for accountable and trustworthy AI. Empirically, the project draws on ethnographic fieldwork in a credit scoring agency that assessed and implemented fairness in its scoring procedures. Conceptually, it builds on the concept of the “folded object” to study how practices and knowledges are involved in unfolding and refolding gender in the scoring algorithm. Understanding algorithms as folded objects allows for analysing how gender is made absent and present, reflecting its historical, temporal and political enactments. The project contributes to STS by exploring how gender is mediated in various ways in development practices, as well as to material-semiotic approaches to AI.
