importance of existing international human rights legal obligations in the
regulation of the design and use of these technologies.
58. At both the domestic and international levels, Member States must ensure
that border and immigration enforcement and administration are subject to
binding legal obligations to prevent, combat and remedy racial and xenophobic
discrimination in the design and use of digital border technologies. These
obligations include but are not limited to:
(a) Swift and effective action to prevent and mitigate the risk of the
racially discriminatory use and design of digital border technologies, including
by making racial equality and non-discrimination human rights impact
assessments a prerequisite for the adoption of systems before they can be publicly
deployed. These impact assessments must incorporate meaningful opportunity
for co-design and co-implementation with representatives of racially or
ethnically marginalized groups, including refugees, migrants, stateless persons
and related groups. A purely or even mainly voluntary approach to equality
impact assessments will not suffice; a mandatory approach is essential;
(b) An immediate moratorium on the procurement, sale, transfer and use
of surveillance technology, until robust human rights safeguards are in place to
regulate such practices. These safeguards include human rights due diligence
that complies with international human rights law prohibitions on racial
discrimination, independent oversight, strict privacy and data protection laws,
and full transparency about the use of surveillance tools such as image
recordings and facial recognition technology. In some cases, it will be necessary
to impose outright bans on technology that cannot meet the standards enshrined
in international human rights legal frameworks prohibiting racial
discrimination;
(c) Ensuring transparency and accountability for private and public
sector use of digital border technologies, and enabling independent analysis and
oversight, including by only using systems that are auditable;
(d) Imposing legal obligations on private corporations to prevent, combat
and remedy racial and xenophobic discrimination due to digital border
technologies;
(e) Ensuring that public-private partnerships in the provision and use of
digital border technologies are transparent and subject to independent human
rights oversight, and do not result in abdication of government accountability
for human rights.
59. The Special Rapporteur had the opportunity to consult with representatives
of UNHCR and IOM on their use of different digital border technologies. Based
on those consultations, she recommends that both bodies adopt and implement
mechanisms for sustained and meaningful participation and decision-making by
migrants, refugees and stateless persons in the adoption, use and review of digital
border technologies. She makes the recommendations set out below.
60. IOM should:
(a) Mainstream and strengthen international human rights obligations
and principles, especially those relating to equality and non-discrimination, in
its use and oversight of digital border technologies, including in all its
partnerships with private and public entities. This requires moving beyond a
narrow focus on privacy concerns relating to data sharing and data protection,
and mandating rather than recommending equality and non-discrimination
protections;