A/HRC/48/Add.xx
individual around data access if to do otherwise would “prejudice effective immigration
control.”85 These rights include the rights to object to and restrict the processing of one’s data
and the right to have one’s personal data deleted.86 The UK’s amended Police Act empowers
not only police but also immigration officers to interfere with mobile phones and other
electronic devices belonging to asylum seekers.87 Going far beyond even the data carrier
evaluation permitted in Germany, the UK Crime and Courts Act of 2013 enables police and
immigration officers to carry out secret surveillance measures, place bugging devices, and
hack and search mobile phones and computers.88 The individuals affected will be disproportionately targeted on grounds of national origin, when national origin should never be a basis for diminished privacy and other rights.
B. Discriminatory Structures
35. In her previous report, the Special Rapporteur showed how the design and use of
different emerging digital technologies can produce racially discriminatory structures that
undermine enjoyment of human rights for certain groups, on account of their race, ethnicity
or national origin, in combination with other characteristics. She urged that emerging digital
technologies should be understood as capable of creating and sustaining racial and ethnic
exclusion in systemic or structural terms. In this sub-Section, the Special Rapporteur
highlights ways in which migrants, refugees, stateless persons and related groups are being
subjected to technological interventions that expose them to a broad range of actual and
potential rights violations on the basis of actual or perceived national origin or immigration
status.
1. Surveillance Humanitarianism and Surveillance Asylum
36. Commentators have warned of the rise of “surveillance humanitarianism”,89 whereby increased reliance on digital technologies in service provision and other bureaucratic processes perversely results in the exclusion of refugees and asylum seekers from basic necessities such as access to food.90 Even a misspelled name can result in
“bureaucratic chaos” and accusations of providing false information, slowing down what is
already a slow asylum process.91 Potential data privacy harms are often latent, but in conflict zones they can turn violent: data compromised or leaked to a warring faction could result in retribution against those perceived to be on the wrong side of the conflict.92
37. In this regard, one submission highlights the dangers associated with the growing use
of digital technologies to manage aid distribution.93 In refugee camps in Afghanistan, iris
registration has reportedly been used as a prerequisite for receiving assistance for returning Afghan refugees.94 The impact of collecting, digitizing and storing the refugees’ iris data can be grave when systems are flawed or abused.95 It has also been documented that such biometric
surveillance tools have led to system aversion and loss of access to goods and services for
survival.96 This submission noted, for example, the failure of technology in Rohingya refugee
camps in Bangladesh that resulted in the denial of food rations to refugees.97 UNHCR
reported to the Special Rapporteur that its policy is that safeguards should be in place to
85 PICUM, Submission.
86 Ibid.
87 GFF, Submission.
88 Ibid.
89 https://www.nytimes.com/2019/07/11/opinion/data-humanitarian-aid.html.
90 Beduschi, Submission.
91 Mark Latonero et al., Digital Identity in the Migration & Refugee Context: Italy Case Study (April 2019).
92 https://www.nytimes.com/2019/07/11/opinion/data-humanitarian-aid.html.
93 Amnesty International, Submission.
94 Ibid.
95 Ibid., citing A/HRC/39/29.
96 Amnesty International, Submission.
97 Ibid.