"Israel's 'Lavender' AI Program Identifies Thousands of Bombing Targets in Gaza"

TL;DR Summary
The Israeli military reportedly used a secretive AI program called "Lavender" to identify thousands of bombing targets in Gaza, despite a reported error rate of roughly 10%. The program allegedly flagged some 37,000 Palestinians as suspected Hamas operatives, many of them low-level, who were then marked for strikes. The Israeli military has denied the claims, saying "Lavender" is simply a database for cross-referencing intelligence sources. The US says it is looking into the report, while the Gaza Health Ministry reports that 33,000 Palestinians have been killed in the conflict over the past six months.
- Israel used secretive AI program called 'Lavender' to identify thousands of bombing targets: report New York Post
- Analysis | Israel offers a glimpse into the terrifying world of military AI The Washington Post
- Google Won't Say Anything About Israel Using Its Photo Software to Create Gaza “Hit List” The Intercept
- 'Lavender': The AI machine directing Israel's bombing spree in Gaza +972 Magazine
- 'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets The Guardian