concentrating large volumes of AI-produced works, this leads to the invisibility of
works representing these cultures, perpetuating pre-existing inequalities.
Underrepresentation can also lead to stereotypical representations, offering
folklorized or stigmatizing views of these groups, reinforcing prejudices,
disrespecting their cultural identities and harming their dignity.57 All these harms
have been progressively exposed in the literature on AI in recent years, to such a
degree that scholars now ask whether AI is becoming the new colonizer of
Indigenous Peoples.58
36. Decontextualization is evident in the fashion industry. The use by AI of
Indigenous garments and motifs reduces them to mere aesthetics, stripping them of
the cultural memory that grounds their meaning. Prints “inspired” by Maori or
Nigerian heritage motifs, taken without any acknowledgment of where those motifs
came from or of who designed them, and without consultation with the source
community, violate cultural rights and potentially propagate harm. AI systems have
disassociated designs from their social and historical lineages, converting them into
commodified fragments in a broader process of algorithmic consumption.59 Used
instead with the active participation and consent of the communities concerned, AI
tools could ensure that the designs are employed coherently, in line with their true
meaning and significance.
37. Initiatives to increase the availability of data related to marginalized groups in
digital environments without the consent of the source community carry risks. They
can result in a loss of control over narratives and cultural representations,60 as well as
cultural appropriation, whereby “ancestral knowledge, sacred art forms and
traditional expressions become training data.”61
38. In some emerging practices, the groups concerned are involved at every stage
of projects that affect them. For example, the Creative Labour and Critical Futures
research cluster works with minorities.62 At the Mila – Quebec Artificial Intelligence
Institute, an initiative led by Michael Running Wolf uses AI to document and revive
Indigenous languages, in cooperation with local communities and only to the extent
that they consent to the use of their data.
39. AI-generated content may also reinforce stereotypical representations of
women. These biases stem from training data sets that underrepresent women’s
voices, experiences and contributions, or that overrepresent them in traditional or
objectifying roles. “The underrepresentation of women in AI development and
leadership roles can further lead to the creation of socio-technical systems which fail
to consider the diverse needs and perspectives of all genders, once again perpetuating
stereotypes and gender disparities.”63 As a result, AI systems can render invisible the
diversity of women’s identities and roles in societies, impeding the exercise of
women’s cultural rights.
40. Generative AI tools pose specific challenges for women and girls, notably by
reinforcing gender-based discrimination and enabling new forms of harm to their
dignity and integrity. For instance, “text-to-image models can easily generate images
__________________
57 Submission of the Human Rights Ombudsman of Guatemala.
58 Jason Edward Lewis, ed., Indigenous Protocol and Artificial Intelligence, position paper
(Honolulu, 2020).
59 Submission by Indira Boutier, p. 7.
60 Submission by Centro de Investigación y Docencia Económicas and Artículo 19 Oficina para
México y Centroamérica, p. 3.
61 Submission by Brazil, p. 2.
62 Submission by Creative Labour and Critical Futures, p. 5.
63 UNESCO and International Research Centre on Artificial Intelligence, “Challenging systematic
prejudices: an investigation into gender bias in large language models” (Paris, 2024), p. 5.