Gaza: Israeli Military’s Digital Tools Risk Civilian Harm

(Jerusalem, September 10, 2024) – The Israeli military’s use of surveillance technologies, artificial intelligence (AI), and other digital tools to help determine targets to attack in Gaza may be increasing the risk of civilian harm, Human Rights Watch said today in releasing a question-and-answer document about the tools. These digital tools raise grave ethical, legal, and humanitarian concerns.

The Israeli military is using four digital tools in the Gaza hostilities to estimate the number of civilians in an area before an attack, to notify soldiers when to attack, and to determine whether a person is a civilian or a combatant and whether a structure is civilian or military. Human Rights Watch found that the digital tools appear to rely on faulty data and inexact approximations to inform military actions in ways that could contravene Israel’s obligations under international humanitarian law, in particular the rules of distinction and precaution.

“The Israeli military is using incomplete data, flawed calculations, and tools not fit for purpose to help make life and death decisions in Gaza, which could be increasing civilian harm,” said Zach Campbell, senior surveillance researcher at Human Rights Watch. “Problems in the design and use of these tools mean that, instead of minimizing civilian harm, the use of these tools could be resulting in the unlawful killing and wounding of civilians.”

These tools entail ongoing and systematic surveillance of Palestinian residents of Gaza, including data collected prior to the current hostilities, in a manner that is incompatible with international human rights law. The tools use Palestinians’ personal data to inform threat predictions, target identification, and machine learning.

Human Rights Watch relied on public statements from Israeli officials, previously unreported material published by the Israeli military, media reports, and interviews with experts and journalists to assess the four tools, which the Israeli military has used in the hostilities in Gaza since October 7, 2023. This information, while incomplete, provides important details about how these tools work, how they were built, what data they use, and how they could support military decision-making.

The four tools are: a tool based on mobile phone tracking to monitor the evacuation of Palestinians from parts of northern Gaza, where the Israeli military ordered the entire population to leave on October 13; a tool called “The Gospel” that generates lists of buildings or other structural targets to be attacked; a tool called “Lavender” that assigns ratings to people in Gaza based on their suspected affiliation with Palestinian armed groups, for purposes of labeling them as military targets; and a tool called “Where’s Daddy?”, which purports to determine when a target is in a particular location – often their presumed family home, according to media reports – so they can be attacked there.

These tools are limited by issues common to other types of technology, but in military contexts they can have deadly consequences for civilians. Two of the tools, the evacuation monitoring tool and Where’s Daddy?, apparently rely on mobile phone location data to inform targeting decisions, troop movements, and other military actions. Although phone location data has many practical uses in daily life, it is not accurate enough to inform military decisions, especially given the massive damage to communications infrastructure in Gaza.
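
The scale of the accuracy problem can be shown with rough numbers. In the sketch below, the error radii and building size are assumptions chosen for illustration, not measurements from Gaza or figures from any reported system; the point is that a phone position estimated from cell towers rather than GPS can carry uncertainty many times wider than an entire building:

```python
# Illustrative sketch only: the error radii and building size below are
# assumptions invented for this example, not data about any real system.

CELL_TOWER_ERROR_M = 500.0  # assumed error radius of a tower-based estimate
GPS_ERROR_M = 5.0           # assumed error radius of GPS in good conditions
BUILDING_WIDTH_M = 20.0     # assumed footprint of a residential building

def can_resolve_building(error_radius_m: float, building_width_m: float) -> bool:
    """A location estimate can place a person inside a specific building
    only if its error radius is smaller than about half the building width."""
    return error_radius_m < building_width_m / 2

for method, error_m in [("cell tower", CELL_TOWER_ERROR_M), ("GPS", GPS_ERROR_M)]:
    ok = can_resolve_building(error_m, BUILDING_WIDTH_M)
    print(f"{method}: ±{error_m:.0f} m -> building-level resolution: {ok}")

# Output:
# cell tower: ±500 m -> building-level resolution: False
# GPS: ±5 m -> building-level resolution: True
```

When cell networks are damaged, phones that still connect often fall back to coarse tower-based estimates, and phones that cannot connect vanish from the data altogether, so a tool that counts phones in an area to estimate civilian presence can systematically undercount the people actually there.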

The other two tools, the Gospel and Lavender, appear to rely on machine learning to inform targeting decisions, using criteria developed by algorithms that are most likely biased and incomplete, through a process that is technically impossible to scrutinize. Algorithmic outputs often reflect the biases of their programmers and their society. While digital tools may appear neutral, human operators often place excessive trust in them, even though they are only as accurate as the data they were built with, which in military contexts is often incomplete and not representative of the environment in which the tool operates. Relying on such algorithms risks contravening international humanitarian law obligations regarding the protection of civilians.

Lavender and the Gospel rely on machine learning to distinguish military objectives from civilians and civilian objects. Machine learning is a type of AI in which computerized systems draw inferences from data and recognize patterns without explicit instructions. Using it to assign suspicion or to inform targeting decisions can increase the likelihood of civilian harm.
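
As a purely illustrative sketch of that risk, the toy scoring model below uses hypothetical features, weights, and a hypothetical threshold – assumptions made for this example, not details of Lavender, the Gospel, or any reported system. It shows how proxy features that correlate with wartime displacement rather than with militancy can flag a civilian:

```python
# Hypothetical sketch: the features, weights, and threshold are invented for
# illustration and are not the design of any real system.

def suspicion_score(person: dict) -> float:
    """Weighted sum over proxy features. In a machine learning system the
    weights would be learned from training data; here they are hard-coded."""
    weights = {
        "changed_phone_recently": 0.40,   # also common among displaced civilians
        "moved_addresses_often": 0.35,    # also common among displaced civilians
        "contact_with_flagged_person": 0.25,
    }
    return sum(weights[feature] for feature, present in person.items() if present)

THRESHOLD = 0.5  # above this, the tool would label someone a target

displaced_civilian = {
    "changed_phone_recently": True,        # e.g., phone lost in the hostilities
    "moved_addresses_often": True,         # e.g., repeatedly displaced
    "contact_with_flagged_person": False,
}

score = suspicion_score(displaced_civilian)
print(f"score={score:.2f}, flagged={score >= THRESHOLD}")
# score=0.75, flagged=True -- displacement is misread as evidence of militancy
```

A learned model has the same failure mode at larger scale: if past “positive” examples in the training data over-represent behaviors that displaced civilians also exhibit, the model encodes those behaviors as evidence of militancy, and an operator who sees only the final score cannot tell why it is high.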

It has not been possible to document when and where the Israeli military is using these digital tools, or the extent to which it has used them in conjunction with other methods of information and intelligence collection.

The Israeli military should ensure that any use of technology in its operations complies with international humanitarian law. No targeting decision should be made based solely on a machine learning tool’s recommendation. If Israeli forces act on these tools’ recommendations or assessments without sufficient scrutiny or additional information – as has been reported – and the resulting attacks cause civilian harm, they would be violating the laws of war. Serious violations of the laws of war committed with criminal intent, such as indiscriminate attacks on civilians, are war crimes.

As an occupying power in Gaza, Israel should also ensure that its use of digital tools does not violate Palestinians’ right to privacy. In May 2024, Human Rights Watch discovered data that the Israeli military had posted publicly online, apparently in error, including what appears to be operational data related to systems used for monitoring the evacuation and movement of people through Gaza, as well as for projecting the likely civilian harm of attacks in particular areas. The data was embedded in the source code of the Israeli military’s evacuation information website and included personal information, such as the surnames of the most populous extended families in each block.

The Israeli military’s publication of this data online violates Palestinians’ right to privacy and demonstrates that it is not taking adequate security precautions with the data it collects, Human Rights Watch said.

Human Rights Watch sent a letter with detailed questions to the Israeli military on May 13, 2024, but the military has not responded.

In the past 10 months of hostilities, more than 40,000 people in Gaza have been killed and 94,000 injured, according to the Gaza Health Ministry. Over 70 percent of civilian infrastructure and over 60 percent of civilian homes have been destroyed or severely damaged, and virtually all of Gaza’s residents have been displaced from their homes. Impartial investigations into the use of these digital tools are needed to determine whether and to what extent they have unlawfully contributed to the loss of civilian life and property, and what steps are needed to prevent future harm, Human Rights Watch said.

“The use of flawed technology in any context can have negative human rights implications, but the risks in Gaza couldn’t be higher,” Campbell said. “The Israeli military’s use of these digital tools to support military decision-making should not be leading to unlawful attacks and grave civilian harm.”


