"AI in Israel's Gaza Operations: Target Selection and Controversy"

The Israeli military has reportedly used an artificial intelligence tool called "Lavender" to identify bombing targets in Gaza, with human review of its suggestions described as cursory at best. The tool allegedly has a roughly 10% error rate, and officials quoted in the reporting say human personnel often served only as a "rubber stamp" for the machine's decisions. The investigation comes amid heightened international scrutiny of Israel's military campaign, following air strikes that killed foreign aid workers and a deepening humanitarian crisis in Gaza. The Israeli military denies using AI to identify terrorists but acknowledges using a database to cross-reference intelligence sources during target identification. Reports also allege that the army systematically attacked targets in their homes, resulting in civilian casualties, and preferred unguided munitions, which can cause large-scale damage.
- "Israel is using artificial intelligence to help pick bombing targets in Gaza, report says" (CNN)
- "'Lavender': The AI machine directing Israel's bombing spree in Gaza" (+972 Magazine)
- "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets" (The Guardian)
- "Israel's war on Gaza live: Israel accused of 'AI-assisted genocide' in Gaza" (Al Jazeera English)
- "IDF denies report that it's using AI to build list of 37,000 targets based on Hamas ties" (The Times of Israel)