The Israeli military’s use of artificial intelligence to identify potential targets in its conflict with Hamas in Gaza has sparked international concern. A report from the Israeli magazine +972 and the Hebrew-language outlet Local Call claimed that Israel used an AI program called “Lavender” to target up to 37,000 Palestinians linked to Hamas, with little human oversight. This has led to questions about the ethical implications of delegating life-and-death decisions to algorithms, with U.N. Secretary-General António Guterres expressing deep concern about the reports.

The United States has not independently verified the report, but officials say they are looking into it. The use of AI in targeting decisions has raised alarm among experts and human rights advocates, who worry it could lead to increased civilian casualties. The Israel Defense Forces (IDF) have denied using artificial intelligence to identify targets, stating that they use multiple tools in the process and that analysts must independently conclude whether a suspected target is relevant.

The +972 report alleged that Israel used the AI system to compile a list of Palestinian targets, resulting in increased civilian casualties during strikes. The report also claimed that Israeli officials deemed it acceptable to kill between 15 and 20 civilians for every target, a ratio that could potentially amount to war crimes. The IDF, however, maintains that the system is simply a database used to cross-reference intelligence sources and that it does not carry out strikes when the expected collateral damage is excessive in relation to the military advantage.

Israel’s military policy has faced scrutiny following a strike that killed aid workers with the World Central Kitchen. José Andrés, the organization’s founder, accused Israel of targeting the workers’ vehicles systematically. President Joe Biden has called on Israeli Prime Minister Benjamin Netanyahu to do more to protect civilians in the conflict with Hamas, which has resulted in thousands of deaths in Gaza. The ongoing conflict has led to increased pressure on Israel to adhere to international law and protect civilians in its military operations.

The use of artificial intelligence in targeting decisions highlights the evolving nature of warfare and the ethical dilemmas posed by advanced technology. As world leaders and human rights advocates continue to scrutinize Israel’s military actions in Gaza, the need for accountability and transparency in conflict zones becomes increasingly important. The international community will be closely monitoring developments in the situation to ensure that civilian lives are protected and that all parties involved adhere to international humanitarian law.
