+++
title = "AI is already used for genocide"
author = ["Michał Sapka"]
date = 2024-04-04T21:28:00+02:00
categories = ["blog"]
draft = false
weight = 2001
abstract = "Israel already uses AI systems during wartime with no regard for civilian casualties."
+++

Today, an article on the use of AI systems for selecting and targeting Hamas soldiers was published by [972 Magazine](https://www.972mag.com/lavender-ai-israeli-army-gaza/).
The entire article is essential reading, but one paragraph in particular caught my eye:

> The Lavender software analyzes information collected on most of the 2.3 million residents of the Gaza Strip through a system of mass surveillance, then assesses and ranks the likelihood that each particular person is active in the military wing of Hamas or PIJ.
> According to sources, the machine gives almost every single person in Gaza a rating from 1 to 100, expressing how likely it is that they are a militant.

We already have an active AI system which is capable of mass surveillance of a gigantic population.
A system which is used to target people for extermination.
A system which has a 10% error rate:

> [...] knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.

Think what would happen if your bank made an error in 1 out of 10 transactions.
And we are talking about human lives here.

Not only is the system correct only 90% of the time, but its results are also used to target bombings where it is accepted that a few hundred civilians will be killed.

A 1 in 10 chance that 300 people will die for nothing.
Dumb computer error.

As a software engineer, I refused to work on GenAI, and at the same time someone worked on a mass-murder system with a random chance of being correct.

Please, read the entire [article](https://www.972mag.com/lavender-ai-israeli-army-gaza/).
We're already in the worst-case scenario.
This is the doomsday scenario we were so afraid of.