+++
title = "AI is already used for genocide"
author = ["Michał Sapka"]
date = 2024-04-04T21:28:00+02:00
categories = ["blog"]
draft = false
weight = 2001
abstract = "Israel already uses AI systems during wartime without regard for civilian casualties."
+++

Today, an article on the use of AI systems for selecting and targeting Hamas soldiers was published by [972 Magazine](https://www.972mag.com/lavender-ai-israeli-army-gaza/). The entire article is essential reading, but one paragraph especially caught my eye:

> The Lavender software analyzes information collected on most of the 2.3 million residents of the Gaza Strip through a system of mass surveillance, then assesses and ranks the likelihood that each particular person is active in the military wing of Hamas or PIJ. According to sources, the machine gives almost every single person in Gaza a rating from 1 to 100, expressing how likely it is that they are a militant.

We already have an active AI system capable of mass surveillance of a gigantic population. A system which is used to target people for extermination. A system which has a 10% error rate:

> [...] knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.

Think what would happen if your bank made an error in 1 out of 10 transactions. And here we are talking about human lives. Not only is the system correct only 90% of the time, but its results are also used to target bombings where it is accepted that a few hundred civilians will be killed. A 1 in 10 chance that 300 people will die for nothing. A dumb computer error.

As a software engineer, I refused to work on GenAI, and at the same time someone worked on a mass-murder system with a random chance of being correct.

Please, read the entire [article](https://www.972mag.com/lavender-ai-israeli-army-gaza/). We're already in the worst-case scenario. This is the doomsday scenario we were so afraid of.