A/HRC/56/68
algorithm designer chooses to define “success” shapes what the algorithm seeks to
maximize. If an algorithm is set to minimize the number of new offences, it may correlate
longer sentences with lower reoffending rates, because people cannot reoffend while
incarcerated. It can then use those patterns to recommend longer sentences.
36. Researchers have suggested that recidivism predictors are not accurate and that their
errors have a disproportionate impact on racially marginalized groups. For example, a study
in the United States found that risk scores were very unreliable in their forecasting of violent
crime. People of African descent were reportedly mislabelled as future criminals at almost
twice the rate of white individuals.
(d) Autonomous weapon systems
37. Autonomous weapon systems include any weapon systems with autonomy in their
critical functions, including lethal autonomous weapons and less-lethal weapons. They have
applications in law enforcement as well as military contexts and remain largely unchecked.
These systems can select, detect, identify and attack targets without human intervention. An
autonomous weapon is triggered by sensors and software that match a person with a “target
profile” as determined by the system’s algorithm. Autonomous weapon systems have very
serious human rights implications, including relating to the right to life, the prohibition of
torture and other ill-treatment and the right to security of person.55
38. The First Committee of the General Assembly heard that the window of opportunity
to enact guardrails against the perils of autonomous weapons and artificial intelligence’s
military applications was rapidly closing as the world prepared for a “technological
breakout”.56 The Special Rapporteur on extrajudicial, summary or arbitrary executions has
previously recommended that the Human Rights Council call upon all States to declare and
implement national moratoriums on at least the testing, production, assembly, transfer,
acquisition, deployment and use of lethal autonomous robotics.57
39. There is a serious risk of grave and, in some circumstances, deadly racial
discrimination resulting from the use of autonomous weapon systems. The criteria used to
select targets likely include gender, age and race.58 Target profiles also include seemingly
neutral criteria, such as weight or heat signatures, but the machines often reflect the biases of
their programmers and society. They can also be programmed with intentionally
discriminatory target profiles.59 For example, Israel is reportedly using lethal autonomous
and semi-autonomous weapon systems. This reportedly includes the use of remote-controlled
quadcopters to target Palestinians, in addition to automated target generation systems,
operating at unparalleled speed and volume, to produce “kill lists”.60 The Gospel and
Lavender, two artificial intelligence technology systems used by the Israel Defense Forces,
are reported to have intensified the levels of destruction in Gaza, resulting in significant
casualties, in particular among Palestinian women and children.61
55 Amnesty International, “Autonomous weapons systems: five key human rights issues for consideration” (April 2015), p. 5.
56 United Nations, “Without adequate guardrails, artificial intelligence threatens global security in evolution from algorithms to armaments, speaker tells First Committee”, 24 October 2023.
57 A/HRC/23/47, para. 113.
58 Ray Acheson, “Gender and bias”, available at https://www.stopkillerrobots.org/wp-content/uploads/2021/09/Gender-and-Bias.pdf.
59 Bonnie Docherty, “Expert Panel on the Social and Humanitarian Impact of Autonomous Weapons at the Latin American and Caribbean Conference on Autonomous Weapons”, Human Rights Watch, 8 March 2023.
60 Marwa Fatafta and Daniel Leufer, “Artificial genocidal intelligence: how Israel is automating human rights abuses and war crimes”, Access Now, 9 May 2024.
61 Yuval Abraham, “‘Lavender’: the AI machine directing Israel’s bombing spree in Gaza”, +972 Magazine, 3 April 2024.
GE.24-08849