Enunciative Practices and Dataset Curation: Toward a Decolonial Semiotics of AI-Generated Imagery
Cristina Voto
2026-01-01
Abstract
This paper examines the impact of AI-generated imagery, considering its potential both to perpetuate existing inequalities and to challenge entrenched biases. It emphasises the necessity of a critical approach to AI, particularly to text-to-image models, and advocates for the urgent integration of gender and decolonial perspectives. These viewpoints are crucial for viewing technologies as collaborative agents rather than substitutes. Employing a theoretical framework grounded in a more-than-human perspective and AI-centred semiotics, this work analyses an artistic project generated through a text-to-image model. It demonstrates how AI can actively participate in cultural and social narratives and potentially redress colonial legacies through careful dataset curation. By highlighting the importance of addressing biases within AI algorithms and data structures to prevent the replication of racial, gender, and class hierarchies, the paper discusses the role of datasets in visualising and giving presence to historically marginalised communities. In conclusion, it advocates for a future where AI and its imagery not only challenge existing social structures but also promote social justice and equity, thereby transforming the landscape of visual culture and technology. Such a transformation requires reevaluating the relational dynamics between humans and machines and recognising AI's potential as a force for societal change.



