purportedly reveal information, including one’s sexuality, 137 political preference 138 or
even criminality. 139 The accuracy and, in some cases, the scientific basis of these
technologies are heavily contested. Nonetheless, some argue that, irrespective of
whether these technologies violate mental privacy, they can and do still result in
punishment for inferred thought. 140 For example, Chinese authorities reportedly
deploy “emotion detection” technologies to infer “criminal” states of mind among the
public, which could lead to administrative or criminal sanctions. 141 Moreover, several
corporations and educational institutions allegedly utilize biometric data to infer the
thoughts of their employees and students, respectively. Technology that monitors
employee brain activity in workplaces is already proliferating, and some scholars
postulate that employees might be punished for inferred thoughts, such as thoughts
on unionizing. 142
70. Recent research indicates that result rankings from Internet search engines have
a dramatic impact on consumer attitudes, preferences and behaviour, potentially even
modifying their very thoughts. For example, five experiments in the United States and
India illustrated the power of search rankings to alter the preferences of undecided
voters in democratic elections, noting that many users choose and trust higher-ranked
results over lower-ranked results. The research shows that these practices can
significantly affect users’ decision-making processes, shifting the voting preferences
of undecided voters by 20 per cent or more. 143
71. Reportedly, Facebook has claimed that tweaking content on individuals’
“newsfeeds” could transfer emotions from person to person, 144 and that its predictive
marketing could identify when children feel “insecure” or “worthless” and “need a
confidence boost”. 145 In Kenya, finance applications have allegedly mined their users’
mobile phone data to predict when they were most vulnerable to predatory credit
offers. 146
72. Technology could disproportionately affect certain groups based on protected
characteristics (e.g., race, gender or religion or belief), including where it utilizes
artificial intelligence trained on data that reflects and perpetuates existing societal
discrimination, thereby affecting when and how their inferred thoughts are
scrutinized. For instance, a 2018 study found that certain emotion recognition
technologies erroneously assessed black faces as expressing anger in twice as many
instances as white faces, and disproportionately assigned them negative emotions
generally. 147
2. Microtargeting
73. Microtargeting is the use of (often large volumes of) personal data gathered from
digital footprints to tailor what individuals or small groups see online. While
traditional advertising is mainly informative, modern advertising draws on techniques
__________________
137 See https://www.gsb.stanford.edu/faculty-research/publications/deep-neural-networks-are-more-accurate-humans-detecting-sexual, p. 250.
138 See https://www.nature.com/articles/s41598-020-79310-1, p. 4.
139 See https://archive.ph/N1HVe.
140 Submission from Access Now.
141 See https://www.article19.org/wp-content/uploads/2021/01/ER-Tech-China-Report.pdf.
142 Submission from Nita Farahany.
143 See https://www.pnas.org/content/112/33/E4512.
144 See https://www.pnas.org/content/111/24/8788.
145 See https://www.theguardian.com/technology/2017/may/01/facebook-advertising-data-insecure-teens. See also https://www.bbc.co.uk/news/technology-58570353.
146 See https://septemberpublishing.org/product/reset/.
147 See https://phys.org/news/2019-01-emotion-reading-tech-racial-bias.html.