Israel's military has acknowledged harming Palestinian civilians at Gaza aid sites, saying lessons have been learned and new instructions issued after incidents of inaccurate artillery fire, amid ongoing concerns over the safety and neutrality of aid operations in the territory.
Israel's military has been using an AI target-creation platform called "the Gospel" to select bombing targets in Gaza during its offensive against Hamas. The platform, which relies on machine learning and advanced computing, has sharply accelerated target production, with the IDF claiming to have identified more than 12,000 targets in Gaza. The Gospel has been used to generate automated recommendations for strikes, including on the private homes of suspected Hamas or Islamic Jihad operatives. Concerns have been raised about the risks to civilians as advanced militaries increasingly rely on complex and opaque automated systems on the battlefield.
The cluster munitions that the U.S. is sending to Ukraine often fail to detonate, posing a lasting danger to civilians. The Pentagon claims the weapons have been improved to minimize civilian harm, but its own statements reveal that they contain older grenades with a failure rate of 14 percent or more. These grenades can remain unexploded for years or even decades, threatening adults and children who come across them. Although the Biden administration had reservations about supplying cluster munitions, it ultimately decided to provide them because of Ukraine's shortage of artillery rounds. Cluster munitions are banned by over 100 countries due to the high risk of civilian casualties.