author     mms <michal@sapka.me>  2024-04-04 21:35:57 +0200
committer  mms <michal@sapka.me>  2024-04-04 21:35:57 +0200
commit     ec4e9e3b6a4d12a4bcd03b0693307c7c0a930de3 (patch)
tree       3e1bd8319409fc06c73837d38ef93d6465951e9d
parent     18e872c103b2612e33621d4083a3b9067bfa4e51 (diff)
feat: ai genocide
-rw-r--r--  .gitignore                         2
-rw-r--r--  content-org/blog.org              44
-rw-r--r--  content/blog/2024/ai-genocide.md  35
3 files changed, 78 insertions, 3 deletions
diff --git a/.gitignore b/.gitignore
index e29891b..df920cc 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,3 +1,5 @@
.DS_Store
.hugo_build.lock
public
+content/star-trek/stv/05x19-fight.md
+resources/_gen/images/ytcovers/z5Wj5GOiJYg_hu6e07be8a1ab2ca78f63023996f8c8c8f_21076_0x600_resize_q90_h2_box.webp
diff --git a/content-org/blog.org b/content-org/blog.org
index c277486..8f5a331 100644
--- a/content-org/blog.org
+++ b/content-org/blog.org
@@ -7,12 +7,48 @@
#+HUGO_WEIGHT: auto
#+HUGO_SECTION: blog
-* 2024 [26/27] :@blog:
+* 2024 [27/27] :@blog:
:PROPERTIES:
:EXPORT_HUGO_SECTION: blog/2024
:END:
-** TODO Don McMillan
-https://www.youtube.com/watch?v=kwz-Md6OoyA
+** DONE AI is already used for genocide
+CLOSED: [2024-04-04 Thu 21:28]
+:PROPERTIES:
+:EXPORT_FILE_NAME: ai-genocide
+:EXPORT_HUGO_CUSTOM_FRONT_MATTER: abstract Israel already uses AI systems during wartime without regard for civilian casualties.
+:EXPORT_HUGO_PAIRED_SHORTCODES: img-r
+:END:
+
+Today, an article on the use of AI systems for selecting and targeting Hamas soldiers was published by [[https://www.972mag.com/lavender-ai-israeli-army-gaza/][972 Magazine]].
+The entire article is an essential read for everyone, but one paragraph in particular caught my eye:
+
+#+begin_quote
+The Lavender software analyzes information collected on most of the 2.3 million residents of the Gaza Strip through a system of mass surveillance, then assesses and ranks the likelihood that each particular person is active in the military wing of Hamas or PIJ.
+According to sources, the machine gives almost every single person in Gaza a rating from 1 to 100, expressing how likely it is that they are a militant.
+#+end_quote
+
+We already have an active AI system capable of mass surveillance of a gigantic population.
+A system which is used to target people for extermination.
+A system which has a 10% error rate:
+
+#+begin_quote
+[...] knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.
+#+end_quote
+
+Think: what would happen if your bank made an error in 1 out of 10 transactions?
+And we are talking about human lives here.
+
+Not only is the system correct only 90% of the time, but its results are also used to target bombings where it is accepted that a few hundred civilians will be killed.
+
+A 1 in 10 chance that 300 people will die for nothing.
+Dumb computer error.
+
+As a software engineer, I refused to work on GenAI, yet at the same time someone else was working on a mass murder system with a random chance of being correct.
+
+Please read the entire [[https://www.972mag.com/lavender-ai-israeli-army-gaza/][article]].
+We're already in the worst-case scenario.
+This is the doomsday scenario we were so afraid of.
+
** DONE Absolute FreeBSD
CLOSED: [2024-03-28 Thu 22:08]
:PROPERTIES:
@@ -1887,3 +1923,5 @@ Wikipedia has a list of /controversial subjects/ https://en.wikipedia.org/wiki/Wiki
** New things make me sad
+** TODO Don McMillan
+https://www.youtube.com/watch?v=kwz-Md6OoyA
diff --git a/content/blog/2024/ai-genocide.md b/content/blog/2024/ai-genocide.md
new file mode 100644
index 0000000..e74c889
--- /dev/null
+++ b/content/blog/2024/ai-genocide.md
@@ -0,0 +1,35 @@
++++
+title = "AI is already used for genocide"
+author = ["Michał Sapka"]
+date = 2024-04-04T21:28:00+02:00
+categories = ["blog"]
+draft = false
+weight = 2001
+abstract = "Israel already uses AI systems during wartime without regard for civilian casualties."
++++
+
+Today, an article on the use of AI systems for selecting and targeting Hamas soldiers was published by [972 Magazine](https://www.972mag.com/lavender-ai-israeli-army-gaza/).
+The entire article is an essential read for everyone, but one paragraph in particular caught my eye:
+
+> The Lavender software analyzes information collected on most of the 2.3 million residents of the Gaza Strip through a system of mass surveillance, then assesses and ranks the likelihood that each particular person is active in the military wing of Hamas or PIJ.
+> According to sources, the machine gives almost every single person in Gaza a rating from 1 to 100, expressing how likely it is that they are a militant.
+
+We already have an active AI system capable of mass surveillance of a gigantic population.
+A system which is used to target people for extermination.
+A system which has a 10% error rate:
+
+> [...] knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.
+
+Think: what would happen if your bank made an error in 1 out of 10 transactions?
+And we are talking about human lives here.
+
+Not only is the system correct only 90% of the time, but its results are also used to target bombings where it is accepted that a few hundred civilians will be killed.
+
+A 1 in 10 chance that 300 people will die for nothing.
+Dumb computer error.
+
+As a software engineer, I refused to work on GenAI, yet at the same time someone else was working on a mass murder system with a random chance of being correct.
+
+Please read the entire [article](https://www.972mag.com/lavender-ai-israeli-army-gaza/).
+We're already in the worst-case scenario.
+This is the doomsday scenario we were so afraid of.