A/HRC/42/59
Kingdom, the World Bank's social inclusion metrics in South America, the European
Union Agency for Fundamental Rights report Being Black in the EU and the seven
commitments on the rights of Afrodescendants, including priority measure 98 on
disaggregated data, of the Montevideo Consensus on Population and Development,
which provide important analyses and instructive methodologies for similar efforts in
other States.
67. The Working Group notes the importance of social inclusion efforts and the relevance of disaggregated data collection in that regard, including in revealing the access to spaces of life and work, the participation in markets and the educational trajectories of people of African descent.
68. The Working Group notes the invisibility of some populations of people of African descent, including many in Asia, despite their presence in societies for generations, and the ability of data disaggregated by race to render visible those populations. People of African descent remain socially and structurally invisible within societies in some States, particularly in Asia, owing to, inter alia, the lack of statistical data, including data disaggregated along ethnic lines.
69. The Working Group acknowledges that the collection of disaggregated data and the availability of open data have allowed public interest to drive deeper, critical analyses of entrenched racial disparities and racially driven outcomes that systematically disfavour people of African descent. Those analyses have fuelled new understandings of the factors that drive ongoing racial bias and disadvantage.
70. The Working Group recognizes the importance of Member States' prioritizing open data, public access and increased diversity among data scientists.
71. Yet data systems and algorithms often incorporate, mask and perpetuate racism in their design and operation, and the Working Group expresses concern that this is considered an acceptable cost for convenient data solutions.
72. The Working Group wishes to highlight that biased policing techniques, such as broken windows policing, stop and frisk or "carding", contribute to biased police data. The use of historical data sets in new analyses and the maintenance of biased policing techniques to generate new data pose a serious threat to human rights.
73. The Working Group notes with concern that little or no effort has been made to ensure that racial biases reflected throughout society have not been embedded in algorithms, coding and data-driven commercial and military products, such as facial recognition software, autonomous weapons systems and signature strike targeting programmes.
74. The Working Group understands the ongoing influence of mindsets that channel certain narratives, including racially biased beliefs, and remain embedded in decision-making, and the importance of surfacing those views to mitigate their impact, particularly in computerized algorithms that may lack reflective capacity or effective independent oversight.
75. The Working Group notes that people of African descent have historically been subjects of experimentation, including ongoing data collection and surveillance without consent. It expresses concern that similar exploitation and experimentation continue via social media platforms and other big data initiatives.
76. The development of new technologies must reflect a strong commitment to human rights and human dignity. The reliance on algorithms to identify risk, target misconduct and carry out operations should not violate the human rights of people of African descent.
77. The Working Group notes the importance of historical data for people of African descent who lost family, culture and identity in the transatlantic trade in enslaved Africans. The liberatory power of historical data has offered truth, history and paths towards reparation and reconciliation for people of African descent in the diaspora.