The report cited six Israeli intelligence officers who admitted to using an AI system called 'Lavender' to classify as many as 37,000 Palestinians as suspected militants, marking these people and their homes as acceptable targets for air strikes. Israel has vehemently denied the AI's role, with an army spokesperson describing the system as 'auxiliary tools that assist officers in the process of incrimination.' According to the report, Lavender was trained on data from Israeli intelligence's decades-long surveillance of Palestinian populations, using the digital footprints of known militants as a model for what signals to look for in the noise.
Poster Comment:
They use AI to tell them which houses to bomb.