of Science and Technology, and the African Institute for Mathematical Sciences had developed master's and PhD programmes in AI, machine learning, mathematics, and data science. Many Western Big Tech firms had opened AI labs in Africa, including Microsoft in Nairobi in 2020, Google in Accra in 2018, and IBM in Nairobi in 2018 and in Johannesburg in 2016. He provided examples of how the past few years had seen a growing number of AI start-ups across the continent, albeit isolated and small-scale. In Ghana, AI and digitalization initiatives had developed the Biometric Ghana Card, a multi-purpose biometric national identification card for day-to-day electronic and physical transactions;11 a digital address system; drone delivery of medical products; and AI systems to help farmers track weather patterns. Challenges to AI development in Africa included low investment in R&D and a lack of relevant skills; supportive policies and robust infrastructure were needed to enable Africa to benefit fully from AI,12 he explained. He added that most African countries lacked the financial, technological, and institutional capacity to drive AI development, and that this was because the continent's development had been undermined by foreign, imperialist interests, including by the international financial architecture, which had contributed to disinvestment in social sectors such as education.
24. Joe Atkinson, University of Southampton, focused on "Human Rights at Work in the Age of Artificial Intelligence". He explained that governance by AI and algorithmic decision-making (ADM) had emerged as a new form of "governance by numbers" in both public and
private sectors. ADM was being used by governments for a wide range of decisions relating
to policing, immigration, housing and social security. It was also being used by corporations
for targeted advertising and recommendations, and personal pricing. Mr. Atkinson detailed
the use of AI in automation and algorithmic management in areas such as recruitment, through pre-screening, CV sifting and interview analysis; in route planning and scheduling allocation (e.g., platforms and apps); in evaluation, through the monitoring of tasks and performance and algorithmic ratings and assessments (e.g., call centres); and in discipline, such as suspending low-scoring workers, altering access to shifts, and relying on algorithmic metrics in dismissals and redundancies.
Automation could eventually lead to a level of job destruction and work scarcity that undermined the right to work, he pointed out, in which case policies designed to protect that right would be needed. Such protections could entail limits on the automation of specific tasks or jobs, policies that spread work across more people, and job guarantee schemes. He further explained that technology also threatened equality at work, underlining that algorithmic management posed a serious threat to the right to non-discrimination, and that this could result from the assumptions or biases of engineers; inaccurate or incomplete data, leading to errors or biases; and the replication or amplification of existing inequalities. The problem was compounded by a lack of transparency and accountability over the design and implementation of algorithmic tools. Technology also posed a threat to the right to just and fair working conditions in numerous ways. Algorithmic management undermined just conditions by enabling the avoidance of employment law protections. Surveillance and the intensification of work created health and safety risks, heightened the level of control over and subordination of workers, and enabled the deskilling of work, he added, emphasizing that the overall effect was to recommodify and dehumanize work.
25. Professor Isak Nti Asare, Center for Applied Cybersecurity Research at Indiana University, drawing from Johan Galtung's 1969 work,13 focused on the application of positive peace in building governance mechanisms for AI and emerging technologies. He underscored the understanding that technology is a product of the underlying society from which it is conceived, including its attitudes, structures and institutions. He warned that inequalities in contemporary technological tools and systems were predicated on the consolidation of power in the digital economy among a few tech companies, and that nothing different could be expected from such a primordial environment of structural inequality. He expounded this fundamental relationship by emphasizing that the current global focus on a paradigm of harm mitigation within the digital ecosystem addressed the symptoms of an
11 See https://register.nia.gov.gh/.
12 United Nations Economic Commission for Africa.
13 Johan Galtung, "Violence, Peace and Peace Research", Journal of Peace Research, vol. 6, No. 3 (1969), p. 168.