According to a new report from Israel-based publications, the Israeli military allegedly used a secretive AI program called “Lavender” to identify thousands of bombing targets in Gaza. The mass surveillance system reportedly flagged some 37,000 people as potential terrorists, including many low-level alleged Hamas operatives who would not typically be targeted, despite a reported error rate of roughly 10%. According to six intelligence officials involved in the war against Hamas, the program’s output received little human review.

Two sources claimed that the Israeli military accepted the “collateral damage” produced by the AI program, with as many as 20 civilians killed for each junior operative flagged by “Lavender.” In one instance, more than 100 civilians were allegedly killed in pursuit of a single senior Hamas official. The sources described loose restrictions on the collateral damage the program could cause, with permitted civilian casualties for specific targets reportedly ranging from the high double digits to the low triple digits.

The Israeli military has strongly denied the explosive claims made in the report. The IDF stated that “Lavender” is simply a database used to cross-reference intelligence sources and produce up-to-date information on the military operatives of terrorist organizations. The IDF clarified that it is not a list of confirmed military operatives eligible for attack, and that each identified target must undergo an individual assessment weighing the anticipated military advantage against the expected collateral damage.

In response to the report, the White House has stated that it is looking into the allegations, which have not yet been verified. The Hamas-run Gaza Health Ministry has reported that 33,000 Palestinians have been killed in the conflict over the past six months. UN data shows that in the first month of the war alone, 1,340 families suffered multiple losses, with 312 families losing more than 10 members. The conflict’s impact on civilians, as well as the use of AI technology in targeting operations, has raised significant concerns globally.

The report sheds light on the controversial use of AI technology by the Israeli military in its conflict with Hamas in Gaza. Target identification through the “Lavender” program reportedly led to a high number of civilian casualties, with little oversight or review by human operators. The allegations have prompted a strong denial from the Israeli military, which maintains that the program is used for intelligence cross-referencing and that individual assessments are conducted for each target. The international community, including the White House, is now looking into the claims made in the report.

The use of AI technology in military operations, especially in conflict zones, raises ethical concerns about civilian casualties and collateral damage. The reported targeting of low-level operatives by the Israeli military’s AI program, allegedly resulting in a significant number of civilian deaths, has sparked outrage and calls for accountability. The conflicting accounts from the Israeli military and the report’s sources highlight the difficulty of verifying how AI technology is used in warfare and of ensuring compliance with international law. As investigations into the allegations continue, the Israeli military’s use of AI in targeting operations is likely to face growing scrutiny and debate in the international community.
