"AI-Driven Targeting: Israel's Use of Artificial Intelligence in Gaza Bombing Operations"

During the Israel-Hamas war, the IDF reportedly permitted up to 20 civilian deaths for every low-level suspected Hamas militant killed, using an AI-based targeting system called "Lavender" to identify potential targets. Sources described the policy as unusually permissive and potentially driven by revenge, with the allowed rate of collateral damage fluctuating over the course of the war. The IDF denied the existence of a kill list containing thousands of Palestinians and stated that strikes are not carried out when the expected collateral damage is excessive in relation to the anticipated military advantage. More than 30,000 Palestinians have been killed in Gaza since the war began, along with foreign aid workers and journalists.
- "IDF allowed up to 20 civilian deaths per low-level suspected Hamas militant: new report" — Business Insider
- "'Lavender': The AI machine directing Israel's bombing spree in Gaza" — +972 Magazine
- "'AI-assisted genocide': Israel reportedly used database for Gaza kill lists" — Al Jazeera English
- "Israel is using artificial intelligence to help pick bombing targets in Gaza, report says" — CNN
- "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets" — The Guardian