"AI in Israel's Gaza Operations: Target Selection and Controversy"

Source: CNN
"AI in Israel's Gaza Operations: Target Selection and Controversy"
Photo: CNN
TL;DR Summary

The Israeli military has reportedly used an artificial intelligence tool called "Lavender" to identify bombing targets in Gaza, with human review of the machine's suggestions described as cursory at best. The tool allegedly has a roughly 10% error rate, and intelligence officials said personnel often served merely as a "rubber stamp" for its decisions. The investigation comes amid international scrutiny of Israel's military campaign, following air strikes that killed foreign aid workers and a deepening humanitarian crisis in Gaza. The Israeli military denies using AI to identify terrorists, but acknowledges using a database to cross-reference intelligence sources during target identification. The reports also allege that the army systematically struck targets in their homes, causing civilian casualties, and preferred unguided munitions, which can cause large-scale damage.
