Israeli AI System 'Lavender' Linked to High Civilian Casualties in Gaza
The AI-driven targeting program has sparked international concern and calls for a ceasefire.
- An AI system named 'Lavender', used by the Israeli military to generate targets with minimal human oversight, has been linked to a significant number of civilian casualties in Gaza.
- The system, along with another program called 'Where’s Daddy?', has resulted in the deaths of thousands of Palestinian civilians, including women and children, by targeting their homes.
- Israeli intelligence officers revealed that targets generated by the AI system, despite a known error rate, were often approved with little scrutiny, leading to the deaths of non-militants.
- International and U.S. pressure has led to a reevaluation of the use of such AI systems, but concerns about their legality and ethical implications remain.
- The Israeli military disputes the claims, stating that their operations comply with international law and aim to minimize civilian casualties.