A/HRC/56/68
pandemic, pulse oximetry devices used to detect low blood oxygen levels overestimated the oxygen levels of people with darker skin tones. 67
3. Education
(a) Academic and career success algorithms
44. In countries such as Finland and the United States, predictive analytics tools are used
in education to determine the likelihood of future success on the basis of data, statistical
algorithms and machine learning. 68 The data used in these algorithms include data on
attendance, grades, behaviour and online activity. They are designed to help educators to
guide students in decisions about their educational and career journeys. While the predictive
analytics tools are intended to help educators improve outcomes for students, choices in algorithm design and training data often lead them to rate racial minorities as less likely to succeed academically and professionally. On the basis of these ratings, educators may steer students from marginalized racial and ethnic groups away from the educational and career paths that would maximize their potential and offer the best opportunities to break cycles of exclusion, or may invest fewer resources in those students.
(b) Grading algorithms
45. Grading algorithms typically use historical grading data to evaluate student
performance. Such data can be biased by historical patterns of systemic racism in educational
institutions. The bias in the data will be replicated by predictive scoring algorithms for
students, especially when teacher input is excluded. 69 Grading algorithms can be hugely
consequential in determining the opportunities available to students, including in relation to
access to university education or employment opportunities after education. Racially biased
automated decisions may therefore limit opportunities for students from marginalized racial
and ethnic groups and undercut the potential of education to be a tool to disrupt systemic
racism.
46. The United Kingdom provides a cautionary example of the deployment of a grading
algorithm. In 2020, Advanced Level (A-level) examinations were cancelled due to the
COVID-19 pandemic. As a substitute for examination grades, teachers were asked to predict
students’ results. The national regulatory agency for grading then deployed an algorithm to
standardize the predicted scores on the basis of each school’s historical grading data. Forty
per cent of students, many of whom attended schools in lower-income areas, had their scores
downgraded as a result. Conversely, the algorithm upgraded a disproportionately high number
of students from independent, fee-paying schools. The Government responded to the
controversy by reversing the algorithm’s standardization. However, the episode caused
significant disruptions to university admissions processes. 70
(c) Large language models in education
47. Generative artificial intelligence tools rely on large language models to produce novel
content, including text, music, images and videos. Large language models are being used in
educational settings and can assist with improving academic outcomes for students of all
ages. Studies have shown that language models are biased towards English, which is the most
widely used language on the Internet and the language in which most artificial intelligence
researchers and technologists work. Moreover, only a handful of the approximately
67 Privacy International submission.
68 Stina Westman and others, “Artificial intelligence for career guidance – current requirements and prospects for the future”, International Academic Forum Journal of Education, vol. 9, No. 4 (2021); and Kelli A. Bird, Benjamin L. Castleman and Yifeng Song, “Are algorithms biased in education? Exploring racial bias in predicting community college student success”, Journal of Policy Analysis and Management, 31 January 2024.
69 Benjamin Herold, “Why schools need to talk about racial bias in AI-powered technologies”, Education Week, 12 April 2022.
70 Bryan Walsh, “How an AI grading system ignited a national controversy in the U.K.”, Axios, 19 August 2020; and Daan Kolkman, “‘F**k the algorithm’? What the world can learn from the UK’s A-level grading fiasco”, London School of Economics Impact Blog, 26 August 2020.
GE.24-08849