[HN Gopher] 'Lavender': The AI machine directing Israel's bombin...
___________________________________________________________________
'Lavender': The AI machine directing Israel's bombing in Gaza
Author : contemporary343
Score : 774 points
Date : 2024-04-03 14:50 UTC (8 hours ago)
(HTM) web link (www.972mag.com)
(TXT) w3m dump (www.972mag.com)
| hindsightbias wrote:
| "Because of the system, the targets never end."
|
| The future is now.
| ourguile wrote:
| The purpose of a system is what it does. :(
| prpl wrote:
| Endless scrolling feed
| malfist wrote:
| There is no justification for killing noncombatants, even if AI
| told you you could.
| basil-rash wrote:
| Wild that this is still a controversial statement on HN, which
| is otherwise rather forward thinking.
| hugodan wrote:
| There is no justification for killing.
| twojacobtwo wrote:
| There are some justifications for killing. Like if you can
| save many lives by killing one. But in general, I agree with
| you.
| hugodan wrote:
| I disagree with you. There is no justification for death.
|
| 1) Where do you draw the line? 2) At what number does that
| one become two? 3) How long do you think until AI is
| justified to start killing those single-digit numbers of
| people?
|
| 4) What if that one person is you? (this is not that hard
| to imagine, suppose a fictitious near future where everyone
| that contributed to some extinction event is deemed
| killable: AI development, global warming, failed to do some
| recycling, etc).
| medvezhenok wrote:
| Presupposing infinite resources, there wouldn't be a
| justification for death per se - since there would always
| be a better, more humane option. The world, however, does
| not have infinite resources, so there is always a
| question of optimal allocation, which will involve
| questions of life and death too.
|
| (not talking about this conflict in particular, just
| making an abstract point)
| stale2002 wrote:
| > 1) Where do you draw the line?
|
| Well, the line would be when you are causing more deaths
| than you are saving.
|
| Would you rather a larger number of people die?
|
| > What if that one person is you?
|
| What if you are among the people whose lives would be
| saved, and that number is much larger?
|
| That argument actually works in favor of the option that
| saves the most lives.
|
| There is no neutral decision here. If you choose to not
| save the much larger group of people, those people are
| dead.
|
| So your only choice is to pick which groups of people
| will die. My preference is to minimize that number, keeping
| it as small as possible. But if you want that number to be
| larger, and to have more people die, that requires some
| explanation.
| XorNot wrote:
| This is not what the article is about, and not what AI was
| being used for.
| rany_ wrote:
| Read between the lines, they're trying to blame their AI for
| the civilian casualties.
| spuz wrote:
| The use of AI and the authorisation to kill civilians are
| unrelated parts of this story. Nowhere does it mention that the
| AI is being used to justify killing of civilians.
| rany_ wrote:
| Yeah, because they need to spell out what they're trying to
| have you infer.
| jijji wrote:
| that would explain the news today of how Israel killed seven aid
| workers in Gaza [0]
|
| [0] https://www.reuters.com/world/middle-east/what-we-know-so-
| fa...
| ceejayoz wrote:
| Shades of https://www.nytimes.com/2012/05/29/world/obamas-
| leadership-i....
|
| > It is also because Mr. Obama embraced a disputed method for
| counting civilian casualties that did little to box him in. It
| in effect counts all military-age males in a strike zone as
| combatants, according to several administration officials,
| unless there is explicit intelligence posthumously proving them
| innocent.
|
| > Counterterrorism officials insist this approach is one of
| simple logic: people in an area of known terrorist activity, or
| found with a top Qaeda operative, are probably up to no good.
| "Al Qaeda is an insular, paranoid organization -- innocent
| neighbors don't hitchhike rides in the back of trucks headed
| for the border with guns and bombs," said one official, who
| requested anonymity to speak about what is still a classified
| program.
| arp242 wrote:
| In the case of Al Qaeda, that might actually have been true?
| I don't think you can really compare Hamas to Al Qaeda;
| almost everything meaningful is different.
| ceejayoz wrote:
| > In the case of Al Qaeda, that might actually have been
| true?
|
| Very clearly not, as admitted by the man himself.
| https://www.pbs.org/newshour/world/obama-says-u-s-drone-
| stri...
| arp242 wrote:
| That doesn't mention Al Qaeda? It just talks about drone
| strikes against ISIS, which is yet again quite a
| different organisation than Al Qaeda and Hamas.
| ceejayoz wrote:
| So ISIS gives "hitchhike rides [to innocent neighbors] in
| the back of trucks headed for the border with guns and
| bombs"?
|
| If you want Al Qaeda-specific cases, they take about
| three seconds to find.
| https://www.washingtonpost.com/world/2023/05/18/pentagon-
| dro..., for example.
|
| edit: The Yemen case cited in my link above was AQ;
| https://www.hrw.org/report/2014/02/19/wedding-became-
| funeral...
|
| "They were an adult male near a target" is not a safe way
| of determining guilt for capital crimes. We should not
| accept it.
| arp242 wrote:
| What is your point even? All I said is that you can't
| compare Al Qaeda and Hamas, and how they operate, and how
| to combat them. I never said that US drone strikes
| were/are 100% perfect, or even that I liked the entire
| programme.
| ceejayoz wrote:
| My point is "if they're near a target they're a target"
| is an insane standard to use for these sorts of strikes,
| and the article this entire HN discussion is about makes
| it pretty clear such a standard is in use in Gaza right
| now.
|
| > This was despite knowing that the system makes what are
| regarded as "errors" in approximately 10 percent of
| cases, and is known to occasionally mark individuals who
| have merely a loose connection to militant groups, or no
| connection at all.
|
| > Moreover, the Israeli army systematically attacked the
| targeted individuals while they were in their homes --
| usually at night while their whole families were present
| -- rather than during the course of military activity.
|
| > "We were not interested in killing [Hamas] operatives
| only when they were in a military building or engaged in
| a military activity," A., an intelligence officer, told
| +972 and Local Call. "On the contrary, the IDF bombed
| them in homes without hesitation, as a first option. It's
| much easier to bomb a family's home. The system is built
| to look for them in these situations."
| RUnconcerned wrote:
| They didn't want to kill the aid workers, but the evil AI made
| them do it.
| supposemaybe wrote:
| And they would have lived too, if it weren't for that pesky
| AI!!
| jijji wrote:
| Here's another story today from France24 about how over-
| reliance on AI-driven targeting may have been responsible
| for the airstrike on April 1st 2024 that killed seven aid
| workers in Gaza [0]
|
| [0] https://www.france24.com/en/middle-east/20240403-gaza-aid-
| wo...
| dw_arthur wrote:
| _Two sources said that during the early weeks of the war they
| were permitted to kill 15 or 20 civilians during airstrikes on
| low-ranking militants. Attacks on such targets were typically
| carried out using unguided munitions known as "dumb bombs", the
| sources said, destroying entire homes and killing all their
| occupants._
|
| The world should not forget this.
| prpl wrote:
| So the entire family and neighbors family.
|
| Sure would be convenient if Hamas is 6% of the population
| gryzzly wrote:
| convenient how, you mean?
| bregma wrote:
| The result would be plenty of fresh unoccupied land to
| settle on. Just a little bit of cleanup required.
| gryzzly wrote:
| do I read your tone right, and you suggest that would be
| a reason to celebrate for someone? for whom? you believe
| the aim of the Israeli military action is territory?
| Qem wrote:
| https://www.bbc.com/news/world-middle-east-68650815
| pelorat wrote:
| 40% or something voted for them, and pretty sure all of those
| are considered targets now.
| anigbrowl wrote:
| I don't think basing your ROE on the results of an election
| that took place in 2006 is a valid approach.
| myth_drannon wrote:
| So why didn't it happen? 40,000 operatives x 30 family
| members would mean the entire Gaza population is gone in a
| matter of weeks.
| dariosalvi78 wrote:
| Definitely Palestinians are not going to forget this.
| tjpnz wrote:
| I would extend that to the wider region.
| supposemaybe wrote:
| Em, I think you mean any reasonable minded human that walks
| the planet.
| flir wrote:
| I think he means the cycle of violence will continue.
|
| Which is what I kinda assume Hamas wanted in the first
| place.
| koutetsu wrote:
| Could you please clarify what you mean by "Hamas wanted
| in the first place"? If I'm not mistaken, you're
| referring to the attack on the 7th of October, right? May
| I perhaps add that just on the days preceding that
| attack, Israelis killed a Palestinian in the West
| Bank[0]. So it was not really peaceful before that
| specific date.
|
| [0] https://www.reuters.com/world/middle-
| east/palestinian-killed...
| Workaccount2 wrote:
| This is a dumb road to go down because the finger
| pointing is almost infinite. This conflict has been very
| active for decades now.
| koutetsu wrote:
| I wasn't necessarily trying to point fingers at a
| specific party. I wanted to better understand the
| parent's comment, and while doing so I wrote what I assumed
| was meant by them. I agree that to solve this issue that
| has been going on for many, many years we will have to go
| to the root cause and address that.
| acdha wrote:
| Yes, but specifically the Palestinian impact is why it's
| such a terrible policy for Israel unless you assume their
| goal is perpetual war. Most people do not want to kill
| other people but each innocent killed like this is leaving
| behind friends, family, and neighbors who will want
| vengeance and some fraction of them will decide they need
| to resort to violence because the other mechanisms aren't
| being used. Watching this happen has been incredibly
| depressing as you can pretty much mathematically predict a
| revenge period measured in decades.
| KingMob wrote:
| This assumes they're going to leave enough people alive
| to even enact vengeance. If they murder everyone, then
| there's no need to worry about any Gazan revenge; there
| will be no Gazans.
| acdha wrote:
| Technically possible, yes, but that's increasing the
| death toll from 33k to 2,300k. I don't think that's
| plausible.
| NickC25 wrote:
| It's very plausible. Keep in mind that from the get-go,
| the major global powers, (including Russia!) have adopted
| the mindset of _Israel can do no wrong, and we can't
| criticize them at all_.
|
| Israel could glass the entire Gaza strip and the reaction
| would be a slap on the wrist at best.
| Qem wrote:
| There are millions of Palestinians living in the West Bank
| or as refugees abroad, expelled or descended from those
| expelled in previous rounds of ethnic cleansing. Even if the
| IDF goes final solution on the 2 million Palestinians living
| in the Gaza ghetto, this will not be the end of all
| Palestinians or the Palestinian struggle. See:
| https://en.m.wikipedia.org/wiki/Palestinian_diaspora
| morkalork wrote:
| "Our system is 90% accurate if you don't count the 15-20
| innocent people taken out for each hit". I know they're
| measuring the accuracy of target identification but that's
| laughable when used in this context.
|
| For 100 targets, 90 are 'correct', plus 20x civs per-target is
| 90/2100 or 4% real accuracy.
|
| Say you use a model that's only 50% accurate and limit yourself
| to 10 civs per-target, you're at 50/1100 or 4.5% accuracy!
|
| I guess my point is that no self-respecting data scientist
| would release a 50% accurate model, let alone one used to
| make life or death decisions, and yet, in the application of
| this model, decisions made by humans about its use have made
| it no better than doing exactly that.
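|
| A rough sketch of that arithmetic in Python (the helper name
| and the per-strike ratios are illustrative, not the article's
| wording):
|
|     def effective_accuracy(precision, civilians_per_strike):
|         # people killed per 100 strikes: the intended targets
|         # plus the permitted civilians around each one
|         total_killed = 100 * (1 + civilians_per_strike)
|         correct_targets = 100 * precision
|         return correct_targets / total_killed
|
|     print(effective_accuracy(0.90, 20))  # ~0.043, i.e. ~4%
|     print(effective_accuracy(0.50, 10))  # ~0.045, i.e. ~4.5%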
| beefnugs wrote:
| These kinds of precise numbers of acceptably killed
| innocents are really hurting a specific part of my sympathy
| brain somehow.
|
| "we really need to missile this guy or he will kill more" vs
| "well we got 37 badies and also kim and yashonda, damn i
| really liked yashonda"
|
| Actually after writing this my mind went farther, "since
| yashonda was a good person we actually have a whole bunch of
| hard facts about how good a person she actually was, did a
| lot of help for her community and was a real pillar of
| helping the next generation of kids be less violent...too bad
| we didn't add any of that info into the kill-algorithm "
| TheGeminon wrote:
| With 37,000 Palestinians marked as suspected militants, it
| would mean they expected up to 555,000 - 740,000 civilian
| casualties.
| mulmen wrote:
| How did you arrive at these numbers?
| magicalhippo wrote:
| Not GP but:
|
| > Lavender listed as many as 37,000 Palestinian men
|
| > they were permitted to kill 15 or 20 civilians during
| airstrikes
|
| 37,000 * 15 = 555,000
| 37,000 * 20 = 740,000
| Qem wrote:
| They claim the system has 90% accuracy, so they would
| have to actually kill about 10% more people than these
| numbers, to offset the 10% error rate. So between 610,500
| and 814,000. The whole Gaza strip had about 2 million
| people before the current siege.
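|
| As a back-of-the-envelope sketch of those figures (the inputs
| are the numbers reported above; the 10% offset is the parent
| comment's assumption, not the article's):
|
|     marked = 37_000  # people listed by Lavender
|     for civilians_per_strike in (15, 20):
|         expected = marked * civilians_per_strike
|         with_offset = expected * 1.10  # offset 10% "errors"
|         print(expected, round(with_offset))
|     # prints: 555000 610500
|     #         740000 814000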
| NovemberWhiskey wrote:
| The law of armed conflict acknowledges that civilian deaths are
| inevitable, and only prohibits attacks that are directed at
| civilians, as opposed to those directed at combatants with
| expected civilian casualties as collateral damage.
|
| The legal question is whether the civilian casualties are
| proportional to the concrete military value of the target.
|
| A question that's worth considering is whether, when
| considering proportionality, all civilians (as defined by law)
| are made equal in a moral sense.
|
| For example, the category "civilian" includes munitions workers
| or those otherwise offering support to combatants on the one
| hand, and young children on the other. It also includes members
| of the civil population who are actually involved in
| hostilities without being a formal part of an armed force.
|
| The law of armed conflict doesn't distinguish these; albeit
| that I think people might well distinguish, on a moral level,
| between casualties amongst young children, munitions workers,
| and informal combatants.
| bluish29 wrote:
| > For example, the category "civilian" includes munitions
| workers or those otherwise offering support to combatants on
| the one hand, and young children on the other. It also
| includes members of the civil population who are actually
| involved in hostilities without being a formal part of an
| armed force.
|
| I wonder if you would say the same on the other side where
| every male or female above 18 years is required to serve in
| the military and in the reserve afterwards? [1]
|
| By your argument would you say that all of these are
| legitimate targets?
|
| [1] https://en.m.wikipedia.org/wiki/Conscription_in_Israel
| bawolff wrote:
| > I wonder if you would say the same on the other side
| where every male or female above 18 years is required to
| serve in the military and in the reserve afterwards? [1]
|
| I don't think anything in the grandparent post suggested
| that. If someone used to be a combatant and then ceased
| fighting, usually they then become a civilian. They don't
| stay a combatant for life. Reserve forces not on duty are
| not generally combatants. You have to be in the fight to be
| a combatant.
|
| Things get more complicated with combatants who don't fully
| wear uniforms, which is why failing to wear a uniform is a
| war crime.
|
| It should be noted this isn't so much the grandparent's
| personal opinion as they are just paraphrasing what the
| Geneva Convention says. However, there are of course a lot
| more details to it than that, and the devil is in the
| details.
|
| [Edit: I think I read the post too quickly. The grandparent
| is incorrect when saying "[Civilians] also includes members
| of the civil population who are actually involved in
| hostilities without being a formal part of an armed
| force.". If you pick up a gun and start shooting the other
| side, you are not a civilian. It doesn't matter whether you
| are formally part of the armed forces. Civilians get
| protected because we want to protect the innocents stuck in
| the middle. People who are taking part in a war don't get
| that protection.]
| NovemberWhiskey wrote:
| > _If you pick up a gun and start shooting the other
| side, you are not a civilian._
|
| You're not a civilian while you're holding the gun, but
| you are once you stop shooting again: you lose your
| protection as a civilian during your period of direct
| participation. Should have been more clear on that.
|
| It's probably also worth saying that -- while there's a
| degree of subtlety and complexity when considering the
| legal and moral position of Israel's armed forces --
| there's very little to debate when it comes to actions
| like the Re'im music festival attack. That kind of action
| is obviously illegal and morally repugnant.
| whythre wrote:
| Dropping the gun is not sufficient to claim civilian
| status. Military bases are full of soldiers that may not
| be armed, or even awake. That lack of a gun does not
| suddenly grant them civilian status.
| NovemberWhiskey wrote:
| That's not what I said: I said that civilians who engage
| in fighting lose protection as civilians. Members of
| armed forces, whether currently armed or not, are
| legitimate targets (with certain exceptions; like the
| wounded, those who have surrendered etc).
| bluish29 wrote:
| > while there's a degree of subtlety and complexity when
| considering the legal and moral position of Israel's
| armed forces
|
| No, there is no such complexity. There are very obviously
| undebatable incidents of war crimes by the IDF, like this
| footage from a drone strike that deliberately killed
| civilians in plain sight, followed by attempts to cover the
| bodies[1], and the IDF
| targeting aid workers in a location they knew about [2].
| Also, there are widespread videos by IDF soldiers
| committing atrocities and crimes in Gaza and posting it
| on social media. That is hardly self-defense. This is
| obvious war crimes against civilians. Not to mention the
| mass starvation and carpet bombing of civilians. There is
| very little to debate, and denying them is immoral. You
| are just using a very old tactic of trying to minimize
| IDF crimes by claiming their position is complex.
| Remember the old saying, "The Middle East is a complex
| mess, let's just ignore what is happening there."
|
| [1] https://www.aljazeera.com/program/newsfeed/2024/3/22/
| gaza-dr...
|
| [2] https://www.cbsnews.com/news/central-world-kitchen-
| aid-worke...
| bawolff wrote:
| The aid worker one is probably the most undebatable one,
| but it also just happened. How to judge it depends on
| what happens next. Part of the assumption of war is that
| it involves people, some of whom are going to be bad -
| The expectation isn't that a country is perfect, but that
| it takes steps to prevent war crimes and punish the
| perpetrators when it happens. We don't know yet whether
| or not Israel will charge the people involved in the aid
| worker bombing.
|
| Some of the other things you mention have a lot of grey
| area, because whether or not they are a war crime doesn't
| necessarily depend solely on what happened, but on what
| Israel's intent was and what they knew at various points
| in time. Which is information that's hard to know from
| our vantage point. Some of them could be, but there is
| also potential that they might not be. It's not as clear-
| cut as you make it out to be.
| bluish29 wrote:
| > We don't know yet whether or not Israel will charge the
| people involved in the aid worker bombing
|
| In 2022, Israeli forces killed an American-Palestinian
| journalist on duty in plain sight [1] I will quote that
| from Wikipedia
|
| "Israel denied responsibility and blamed Palestinian
| militants. However, it gradually changed its narrative
| until admitted she was "accidentally" killed by Israeli
| fire, but refused to undertake a criminal investigation"
|
| and
|
| "On September 5, the IDF released the results of its own
| investigation, finding that there was a "high
| possibility" that Abu Akleh was "accidentally hit" by
| army fire, but that it would not begin a criminal
| investigation"
|
| Another example
|
| In 1996, the IDF fired shells on a UN compound near a
| village called Qana and caused a civilian massacre. The UN
| investigated, and Israel rejected the results and did not
| punish anyone [2]. Let's give them the benefit of the
| doubt; maybe they would just learn and avoid doing it
| again. Fear not, in 2006 they gave us the second Qana
| massacre [3] without anyone getting punished.
|
| And there are maybe hundreds of these events, which
| establish that Israel doesn't care and the IDF doesn't get
| punished.
|
| I also refuse the logic that Israel should investigate
| war crimes by its army. That is absurd, like waiting for
| Russia to investigate and taking their word for the Bucha
| massacre. The IDF has very well documented war crimes in
| the past, is the occupying force in Palestine, and is mass
| starving 2.3m people to death in Gaza right now. Believing
| that they will carry out an honest investigation and punish
| their soldiers is laughable.
|
| And let's not forget that the IDF lies, and they are
| blatant liars. We still remember them claiming that weekday
| names in Arabic were names of Hamas operatives [4]. Why do
| you expect us to believe them? Of course, the Israeli
| officials and cabinet members calling for violence and
| crimes against Palestinians are well known to everyone now
| (feel free to ask me for examples).
|
| [1] https://en.wikipedia.org/wiki/Shireen_Abu_Akleh
|
| [2] https://en.wikipedia.org/wiki/Qana_massacre
|
| [3] https://en.wikipedia.org/wiki/2006_Qana_airstrike
|
| [4] https://www.france24.com/en/tv-shows/truth-or-
| fake/20231116-...
| bawolff wrote:
| > "On September 5, the IDF released the results of its
| own investigation, finding that there was a "high
| possibility" that Abu Akleh was "accidentally hit" by
| army fire, but that it would not begin a criminal
| investigation"
|
| I'm not sure what your point is here. Accidentally
| shooting someone is not a war crime (there are details
| here in that it still could be if there is a certain
| level of negligence), and generally a criminal
| investigation would only be started if there was
| sufficient evidence in the preliminary investigation to
| suggest it was intentional.
|
| Could Israel be lying about it? Sure. Militaries doing
| cover ups would hardly be a new story. But this isn't the
| (metaphorical) smoking gun you think it is.
|
| > In 1996...
|
| 1996 was quite a long time ago at this point.
|
| > I also refuse the logic that Israel should investigate
| war crimes by its army
|
| That's generally what is expected of any army under
| international law. If they don't then the higher ups
| become responsible.
|
| In the event of a failure to prosecute, then it goes to
| the ICC to investigate and charge (Israel isn't a member,
| but Palestine is, so anything involving Palestinian
| nationals or territory counts, which is basically this
| whole war. If the ICC didn't have jurisdiction over
| something, then the procedure is that the UN is supposed to
| create a special tribunal).
|
| So it's not like it's solely up to Israel to
| investigate/punish. That is just the first step and what
| is required for Israel to comply with international law.
| If they fail to uphold their obligations there are other
| bodies to enforce, albeit in practice powerful countries
| are often ignored by them.
| bluish29 wrote:
| So after I showed you examples of similar things that
| happened in the past, your narrative now goes from
|
| > We don't know yet whether or not Israel will charge the
| people involved in the aid worker bombing
|
| To
|
| > Could Israel be lying about it? Sure. Militaries doing
| cover ups would hardly be a new story
|
| > So it's not like it's solely up to Israel to
| investigate/punish
|
| Thanks for showing that this discussion is not useful.
|
| PS:
|
| > 1996 was quite a long time ago at this point.
|
| So what? The Holocaust was more than 80 years ago at this
| point. Does that make us forget this horrible history?
| NovemberWhiskey wrote:
| > _No, there is no such complexity. There are very
| obviously undebatable incidents of war crimes by the IDF.
| Like this footage from a drone who deliberately killed
| civilians in plain sight_
|
| I don't think these things are as unequivocal as you
| suggest. I mean, you're assuming those people are
| civilians. Maybe they're not. Almost certainly we will
| never know for sure, and if you can't acknowledge that
| then you're not being objective.
| bluish29 wrote:
| > I don't think these things are as unequivocal as you
| suggest. I mean, you're assuming those people are
| civilians. Maybe they're not. Almost certainly we will
| never know for sure, and if you can't acknowledge that
| then you're not being objective.
|
| I actually expected this reply from you, and expected
| that you would not watch the video and would not get
| interested in the story. [1] The video shows that they
| were not armed. If you're just going to label anyone you
| kill as "maybe he was Hamas", then of course you will kill
| everyone and claim that. You don't kill unarmed people
| walking in plain sight. If this is not obvious to you, then
| you just want to justify the killing of every Palestinian.
|
| [1] https://www.washingtonpost.com/world/2024/03/19/gaza-
| journal...
| emadabdulrahim wrote:
| Except that Israel has no business engaging in armed conflict
| or "war" on a territory they occupy and control. That's the
| only legal issue that matters. Any armed conduct by Israel in
| Gaza is by international definition deemed ILLEGAL. There's
| no right of self defense when you're the predator.
| factorialboy wrote:
| Can we please discuss the merits of this article -- role of AI in
| future conflicts -- without taking sides on any of the ongoing
| wars?
| CubsFan1060 wrote:
| I am going to bet the answer to your question is "No"
| basil-rash wrote:
| No, probably not. When the topic at hand is the selection
| criteria used to justify the killing of tens of thousands of
| civilians, your stance on whether the ones killing tens of
| thousands of civilians are justified in doing so is rather
| intrinsic.
| gizmo686 wrote:
| I'm not sure that is possible. The nature and limitations of
| current AI technology means that it is almost impossible to
| talk about it without coming to certain conclusions about the
| party using it.
|
| To put it bluntly, using AI to decide on targets for lethal
| operations is unconscionable given the current and
| foreseeable state of technology.
|
| Come back to me when it can be trusted to make mortgage
| eligibility decisions without engaging in what would be
| blatantly illegal discrimination if not laundered through a
| computer algorithm.
| harimau777 wrote:
| The issue as I see it is that the tools available don't just
| determine how a given war is fought, they also determine
| whether it is fought at all.
|
| If Israel wasn't able to use tools like this, then it probably
| wouldn't be viable for them to identify much of Hamas (that's
| kind of the point of guerilla warfare). Since that would make
| it difficult to fight a war efficiently, they would be more
| likely to engage in diplomacy.
| raxxorraxor wrote:
| Very doubtful. There is no room for any diplomacy after such
| an attack. It would be fought with more primitive weapons and
| the side with more bombs would prevail.
| mempko wrote:
| Why not both? Taking a side does not mean you are clouded in
| judgement on this point.
| random9749832 wrote:
| By calling it a war you already took a side. Maybe you are just
| ignorant, but that's hardly a good excuse.
| shmatt wrote:
| I suggest everyone listen to the current season of the Serial
| podcast.
|
| > processing masses of data to rapidly identify potential "junior"
| operatives to target. Four of the sources said that, at one stage
| early in the war, Lavender listed as many as 37,000 Palestinian
| men who had been linked by the AI system to Hamas or PIJ.
|
| This is really no different from how the world was working in
| 2001 and choosing who to send to Gitmo and other more secretive
| prisons, or bombing their location.
|
| More than anything else it feels like, just as in the corporate
| world, the engineers in the army are overselling the AI buzzword
| to do exactly what they were doing before it existed.
|
| If you use your paypal account to send money to an account
| identified as ISIS, you're going to get a visit from a 3 letter
| organization really quick. This sounds exactly like that from
| what the users are testifying to. Any decision to bomb or not
| bomb a location wasn't up to the AI, but to humans
| janice1999 wrote:
| > how the world was working in 2001
|
| By the world you mean the US, but yes you are correct.
|
| "NSA targets SIM cards for drone strikes, 'Death by unreliable
| metadata'"
|
| https://www.computerworld.com/article/2475921/whistleblower-...
| shmatt wrote:
| Australia, Canada, Denmark, France, Germany, and Norway were
| heavily involved in the war on terror: bombing Afghanistan,
| but also arresting "suspected" people of their own.
| iooi wrote:
| > how the world was working in 2001 and choosing who to send to
| Gitmo
|
| "Gitmo" didn't open until 2002
| smt88 wrote:
| I know many people won't read past the headline, but please try
| to.
|
| This is the second paragraph:
|
| "In addition to talking about their use of the AI system, called
| Lavender, the intelligence sources claim that Israeli military
| officials permitted large numbers of Palestinian civilians to be
| killed, particularly during the early weeks and months of the
| conflict."
| yonisto wrote:
| What can one do when Hamas has embedded itself in the
| civilian population? Why don't they come out and meet the
| Israeli army on the battlefield? This is no different from
| chemotherapy: in order for the body to survive, some healthy
| cells will die together with the cancerous ones. It is much
| better than the carpet bombing used by other nations.
| KingMob wrote:
| > What can one do when criminals have embedded themselves in
| the civilian population? Why don't they come out and meet
| the police on the battlefield?
|
| We wouldn't tolerate a SWAT team blowing up a hospital if the
| mafia had taken over the basement, I have no idea why you
| think this is acceptable.
|
| > It is much better than the carpet bombing used by other
| nations.
|
| It is _exactly_ like the carpet bombing used by other
| nations.
| yonisto wrote:
| > We wouldn't tolerate a SWAT team blowing up a hospital if
| the mafia had taken over the basement, I have no idea why
| you think this is acceptable.
|
| While I agree with comparing Hamas to the mafia, both being
| criminal organizations, Hamas is more than that. It has
| rockets, it massacred civilians and holds the ideology of
| genociding its enemy. None of that is applicable to the
| mafia. So if its people are hiding in a hospital and
| refuse to surrender, there is no moral objection to blowing
| up the hospital. (Also, if you are referring to Shifa
| Hospital, Israel didn't blow it up; they entered with SWAT
| teams and there was fierce fighting that also cost Israeli
| soldiers' lives.)
|
| > It is exactly like the carpet bombing used by other
| nations.
|
| I'll link to Wikipedia to help you spot the differences [0]
|
| [0] - https://en.wikipedia.org/wiki/Carpet_bombing
| koutetsu wrote:
| I think that no matter your view on the mafia or Hamas,
| it still doesn't justify the amount of death and
| destruction that is being done in Gaza. No matter how you
| spin it or sugarcoat it: killing, displacing and starving
| civilians, killing aid workers and journalists, and
| destroying civilian infrastructure are war crimes. As for
| the Al-Shifa Hospital, Israel's SWAT is either incompetent
| or not a SWAT team at all, judging by the length of the
| operation, 2 weeks, and the photos of the hospital after
| they left.
| Workaccount2 wrote:
| But Hamas is a cancer that is constantly trying to
| metastasize into Israel. Seriously, what is Israel
| supposed to do? Anything they label as "Do not attack"
| just becomes an attack vector for Hamas.
| koutetsu wrote:
| I will try to answer that question. I think it's better
| to find the actual reason for what's happening rather
| than focus on the symptoms. Perhaps Israel could stop
| being an apartheid state[0,1] and treat Palestinians equally.
| It could also stop imposing a blockade on Gaza[2] and
| allow it to blossom again and remove the need for
| supporting Hamas. It could as well allow Palestinians to
| exercise their right to return to where they or their
| parents lived [3].
|
| It's easy to point the finger at Hamas for the region's
| suffering, but that is dishonest and completely omits the
| big role that Israel played in creating this and previous
| events.
|
| [0] https://www.hrw.org/news/2023/12/05/does-israels-
| treatment-p...
|
| [1] https://www.law.cornell.edu/wex/apartheid
|
| [2] https://www.unicef.org/mena/documents/gaza-strip-
| humanitaria...
|
| [3] https://www.hrw.org/news/2024/01/27/gaza-two-rights-
| return
| stefan_ wrote:
| Because this is war and not a SWAT police operation?
|
| If soldiers in the field have reason to believe the enemy
| is in a building and call in air support to bomb it, no
| part of that is a war crime. Even if someone later goes and
| discovers the people in that building were actually
| preschoolers; what matters is what the people in the field
| making the decisions knew at that moment.
| FireBeyond wrote:
| You realize you're actively advocating for a lack of
| critical thinking and investigation, to maintain
| plausible deniability? What could possibly go wrong?
| ThalesX wrote:
| > what matters is what the people in the field making the
| decisions knew at that moment
|
| This is insane. What matters is the objective truth,
| whether or not dozens of preschoolers were killed due to
| an operational mistake.
| stefan_ wrote:
| War is horrible, what's new? But few things in it qualify
| for war crimes.
| mmustapic wrote:
| Would Israel blow an israeli hospital if Hamas took the
| basement?
| smt88 wrote:
| The whole point of this article (and much of what we've
| learned in the last few months) is that Israel is clearly
| _not_ just targeting areas with suspected Hamas activity.
|
| They're using indiscriminate weapons (so not targeting at
| all!), hitting known UN and humanitarian sites, and killing
| so ruthlessly that they killed _Israeli hostages_ that made
| the mistake of being living humans in front of IDF soldiers.
| kmac_ wrote:
| Do you justify killing civilians? That's disgusting.
| scotty79 wrote:
| Neither Israelis nor Hamas believe it's their duty to
| prevent civilian Palestinian deaths in this conflict. At
| this point the only ones who can do anything to improve the
| situation are the civilians themselves, by social distancing
| from Hamas associates by at least the typical blast radius.
| Although I don't imagine this would be very effective
| either.
| cellwebb wrote:
| awful take
| random9749832 wrote:
| >This is no different than chemotherapy
|
| Say that to the parents of the aid workers whose vehicle was
| used as a bullseye: https://www.bbc.co.uk/news/world-middle-
| east-68711282
| rowanseymour wrote:
| As bad as this story makes the Israelis sound, it still reads
| like ass-covering to make it sound like they were at least trying
| to kill militants. It's been clear from the start that they've
| been targeting journalists, medical staff and anyone involved in
| aid distribution, with the goal of rendering life in Gaza
| impossible.
| goethes_kind wrote:
| It seems Israel's strategy is to terrorize the Palestinians to
| extinction. Any reaction is a gift because it gives them a
| reason to accelerate their genocide. If they don't react, they
| are fucked anyway, because nobody cares enough, beyond
| worthless words. And every time the world wakes up for a
| second, and questions one of their acts of terrorism, they will
| have armies of PR agents, everywhere, they will send their
| puppets on TV all over the globe, they will have their armies
| of online commentators gaslighting the (western) world,
| pretending they didn't do anything on purpose.
| rozap wrote:
| Yea this really seems like more of a weapon of propaganda
| directed at Israelis. If they didn't want people to know about
| it, we probably wouldn't know about it. The fact that we're
| talking about it is probably not an accident, and I guess the
| play here would be to convince Israelis that the army is
| technologically advanced and they know what they're doing, so
| don't question it. But AI or not they were going to commit
| genocide and violate every international humanitarian law on
| the book. But for the people that still believe the genocide is
| justified I think this probably improves the optics.
| mupuff1234 wrote:
| > It's been clear from the start that they've been targeting
| journalists, medical staff and anyone involved in aid
| distribution
|
| I really doubt that's the case, seems more like a "fire first
| if any suspicion at all and ask questions later" policy. If
| there was an intentional policy to kill journalists, aid
| workers and medical staff, you'd see a lot more dead.
|
| And you have to be extremely naive or one sided to not realize
| that Hamas does use those type of roles as cover for their
| operations.
|
| Not trying to justify Israel's actions because they are fucked
| up, but based on all the evidence we have you are clearly
| wrong.
| Workaccount2 wrote:
| >And you have to be extremely naive or one sided to not
| realize that Hamas does use those type of roles as cover for
| their operations.
|
| Why would Hamas use anything other than clearly uniformed
| soldiers, marked military vehicles, and civilian distanced
| military installations?
| worddepress wrote:
| South Africa called it.
| 2devnull wrote:
| Probably going to be flame city in this thread, but I think it's
| worth asking: is it possible that, even with collateral damage
| (killing women and children because of hallucinations), AI
| based killing technology is actually more ethical and safer than
| warfare that doesn't use AI? But AI is really just another name
| for math, so maybe it's not a useful conversation. Militaries use
| advanced tech and that's nothing new.
| janice1999 wrote:
| > AI based killing technology is actually more ethical and
| safer than warfare that doesn't use AI
|
| No. It's just a tool. People still configure the parameters and
| ultimately make decisions. Likewise modern missile do not make
| conflicts more or less ethical just because they require
| advanced physics.
| harimau777 wrote:
| The people mentioned in the article say that they spent about
| 20 seconds on each target and basically just rubber stamped
| them. In that case, I don't think people are ultimately
| making the decisions in a traditional sense.
| arp242 wrote:
| Netanyahu has always been saying that they will kill every
| single last Hamas member, no matter the cost.
|
| I mean, is anyone who paid attention surprised by this
| Lavender system? It's doing exactly what they said they
| were doing: kill everyone suspected of Hamas affiliation,
| no matter the cost.
|
| We can have interesting ethical discussions about the AI
| aspect, but I feel that's not really what this is about.
| harimau777 wrote:
| I think that depends on what the alternative is. It seems to me
| that the problem is that there's no way for Israel to wipe out
| Hamas without massive collateral damage. However, instead of
| giving up on wiping out Hamas, they just decided that they are
| OK with the collateral damage.
| r00fus wrote:
| No, the AI was the scapegoat for the IDF deciding to "target" low-
| level enemies, then bombing them with bunker-buster 2000lb
| bombs that leveled entire buildings and city blocks around
| those targets.
|
| The AI did _something_ , but the IDF used it to justify
| effectively committing a genocide.
| mikrl wrote:
| I think the concern is that the AI is making life or death
| judgements against people. Some may of course be lawful
| combatants under the rules that govern such things, but the
| fact that an AI is drawing these conclusions that humans act on
| is the shocking part.
|
| I doubt an artillery system using machine learning to correct
| its trajectory and get better accuracy would be controversial,
| since the AI in that case is just controlling the path of a
| shell that an operator has determined needs to hit a target
| decided upon by humans.
| yonisto wrote:
| We need to consider what the other options are in that
| situation. My thinking is that, due to Hamas being fully
| embedded in the civilian population, the only other
| "reasonable" method is to carpet bomb... After reading the
| article I much prefer the AI method.
| sitkack wrote:
| No. That is genocide and a war crime. Both are war crimes.
| koutetsu wrote:
| Both of these options are war crimes. I think only talking
| about these two options presents a false dichotomy. There are
| many more options that could have been considered. For
| example, Israel could have accepted the hostage swap and then
| picked off Hamas operatives slowly but surely, given its
| superior military and intelligence. Israel, however, preferred
| killing lots of civilians as "collateral damage" in order to
| kill a few Hamas operatives, and they didn't even manage to
| rescue hostages. The crime lies in the blatant disregard for
| civilian life in Gaza.
| dudeinhawaii wrote:
| This is a bizarre take. I've seen it multiple times, in
| multiple threads now. Somehow your only options are "kill women
| and children" in large amounts or carpet bomb. I feel like
| there are dozens, if not hundreds of other options if anyone
| gave a damn.
|
| Ultimately, it's a calculus of "us vs them" and which lives are
| valued or devalued.
|
| Relatedly, are police justified when they shoot at a house with
| 500 rounds, killing the suspect and their entire family that
| happened to be in the general vicinity? Is the math "one law
| enforcement > n lives as long as one was a (potential) badguy"?
|
| If you wanted to do this with minimal civilian casualties, then
| you bring the ground forces in, block by block, and you clear
| things the old-fashioned way. You take casualties, but those
| are casualties who signed up to be "warfighters".
|
| Now this IS inflammatory: I think we have a lot of warfighters
| and cops who are just plain cowards, that's the mentality. Why
| have a class of trained and armed people who are so afraid of
| dying that they'd rather kill anything and everything in their
| path than potentially be injured or killed?
|
| I thought the ethos of the warfighter and law enforcement was
| "act as a shield, act as a bulwark, save lives, give my life so
| that others may be free, etc etc". Nowadays it's "nah, I'm not
| going in that school, there's bad guys with guns and I might
| die, just stay outside".
|
| That leads to a failure of imagination where "blow up a
| building with innocent people as long as you got your target"
| somehow seems justified because you didn't risk a 'good guy'
| life. Cowardice.
| notduncansmith wrote:
| > "This is unparalleled, in my memory," said one intelligence
| officer who used Lavender, adding that they had more faith in a
| "statistical mechanism" than a grieving soldier. "Everyone there,
| including me, lost people on October 7. The machine did it
| coldly. And that made it easier."
| supposemaybe wrote:
| Is the AI the one deciding to let all the children of Gaza
| starve? I'd like to know how far this death machine goes.
| majikaja wrote:
| That's just everyday citizens
|
| https://edition.cnn.com/2024/03/08/middleeast/gaza-israelis-...
| d--b wrote:
| `public bool isSomehowAssociatedWithHamas() { return true; }`
|
| _AI_
|
| Yeah, yeah guidelines and all.
| stevenwoo wrote:
| It's slightly more complicated: a) looks like a male, b) lives
| here, c) send an unguided munition if fewer than 15 or 100
| other non-targets, depending upon the value of the target.
| Mgtyalx wrote:
| @dang Please consider that this is an important and well sourced
| article regarding military use of AI and machine learning and
| shouldn't disappear because some users find it upsetting.
| d--b wrote:
| Should have the ability to turn off comments for these.
| mistermann wrote:
| The goal of that being?
| pc86 wrote:
| HN exists for us to comment on articles. The majority of
| comments are from folks who didn't even read the article (and
| that's fine).
|
| Turning off comments makes as much sense as just posting the
| heading and no link or attribution.
| d--b wrote:
| Well, this post is surely going to get removed because of
| flaming in comments, so, which is better, post with no
| comments, or no post at all?
| pc86 wrote:
| Having civil conversation and banning aggressively those
| who can't be adults?
| xpe wrote:
| > Well, this post is surely going to get removed because
| of flaming in comments
|
| This is one prediction of many possible outcomes.
|
| Independent of the probability of a negative downstream
| outcome:
|
| 1. It is preferable to correct the unwelcome behavior
| itself, not the acceptable events simply preceding it
| (that are non-causal). For example, we denounce when a
| bully punches a kid, not that the kid stood his ground.*
|
| 2. We don't want to create a self-fulfilling prophecy in
| the form of self-censorship.
|
| * I'm not dogmatic on this. There are interesting
| situations with blurry lines. For example, consider
| defensive driving, where it is rational to anticipate
| risky behavior from other drivers and proactively guard
| against it, rather than waiting for an accident to
| happen.
| xpe wrote:
| > so, which is better, post with no comments, or no post
| at all?
|
| The false choice dilemma is dead. Long live the false
| choice dilemma!
| dang wrote:
| I wrote about this here:
| https://news.ycombinator.com/item?id=39920732. If you take a
| look at that and the links there, and still have a question
| that isn't answered, I'd be happy to hear it.
| hunglee2 wrote:
| AI generated kill lists are sadly inevitable. Had hoped we'd get
| a few more years before we'd actually see it being deployed. Lots
| to think about here
| BitwiseFool wrote:
| Such things have been around for at least a decade. It didn't
| start with the same kind of AI that's being talked about
| recently, but there is a large automated scoring component:
| "Targets are often chosen based on metadata."
|
| https://en.wikipedia.org/wiki/Disposition_Matrix
| xdennis wrote:
| I don't know about kill lists, but AI weapons kinda make sense.
|
| No weapons are nice, but if the good guys don't develop AI
| weapons, the bad guys will.
|
| From what I gather, many US engineers are morally opposed to
| them. But if China develops them and gets into a war with the
| US, will Americans be happy to lose knowing that they have the
| moral high ground?
| skidd0 wrote:
| Right, just like if the good guys don't develop a novel
| coronavirus in a lab, the bad guys will and unleash it on the
| world!
|
| Development of tools of death is not a good guy/bad guy
| thing. The "bad guys" think the "good guys" are bad.
|
| I think "killing" is bad, no matter who develops the tools.
| shepherdjerred wrote:
| There are certainly times when killing is justified.
| Defeating the Axis in WW2 is a great example of this.
| mulmen wrote:
| Hundreds of thousands of innocent civilians died in
| strategic bombing campaigns to achieve that outcome. Do
| the ends justify the means?
| shepherdjerred wrote:
| Yes, without a shred of doubt or hesitation.
|
| Germany and Japan were killing millions of innocents in
| WW2. Not only that, but those killings were entirely
| unnecessary.
|
| At least with Israel I can give some of the benefit of
| the doubt that their civilian casualties have some
| strategic outcome. You cannot say the same of Germany and
| Japan in WW2.
|
| (please be charitable to the above; there is a lot of
| nuance here; I don't want to explicitly spell it all out.
| look at my other comments if you want to know my views)
| actionfromafar wrote:
| It's just that I fear their strategic outcome will in the
| end become a net negative, for everyone.
| mulmen wrote:
| I believe the Allies could have defeated the Axis with
| less collateral damage.
|
| The ends were admirable. The means are debatable and in
| some cases regrettable.
| hirsin wrote:
| This assumes that AI based weaponry provides value. The case
| in point here is showing that the only value it provides is a
| flimsy justification for civilian casualties. We... Don't
| need more of that in the US, nor would it provide a "good
| guy" any legitimate value.
| atlantic wrote:
| "Good guys" and "bad guys". Where did you learn your ethics,
| the Cartoon Network?
| uxp100 wrote:
| Depending on your definition of AI they've probably been around
| for a while.
|
| This does seem to be a big step more "AI" than previous systems
| I've heard described though.
| throwaway74432 wrote:
| They're great because the accountability for fuckups goes on
| the system, not on the people using the system. "Oops, the
| system had a bug" doesn't kill careers like "Oops, I made a bad
| call."
| krunck wrote:
| AIs that generate kill lists that kill the innocent should
| themselves be put on a kill list.
|
| Edit: And the humans who approved the list should be held
| accountable, of course.
| kjkjadksj wrote:
| Bombing civilians doesn't kill careers. People were promoted
| for what they did during strategic air campaigns.
| shmatt wrote:
| How do you think people are chosen to visit a secret CIA
| prison, or chosen to get a 12 hour interrogation every time
| they enter the US?
| KingMob wrote:
| Can't wait to be killed by drone strike when a GPT hallucinates
| my name.
| diyseguy wrote:
| The new political excuse for genocide: wasn't me, the AI did it.
| mistermann wrote:
| Continuously throw enough plot twists and general stimulation
| at people and they'll never have the time to consider whether
| they're living in a simulation.
| jakupovic wrote:
| Interesting, how do we prove we don't live in a simulation or
| do we care enough to know?
| supposemaybe wrote:
| Or in the words of Shaggy...
|
| "Saw you blowing up the children..."
|
| "It wasn't me."
| tombert wrote:
| Had a minor panic; I got to a final stage of an interview for a
| company called "Lavender AI". They were doing email automation
| stuff, but seeing the noun "Lavender" and "AI" in combination
| with "bombing" made me think that they might have been part of
| something horrible.
|
| ETA:
|
| I wonder if this is going to ruin their SEO...it might be worth a
| rebrand.
| Nemo_bis wrote:
| Bold of you to assume they won't boast about it...
| tombert wrote:
| I'm pretty sure this company is pretty apolitical and would
| like to stay out of the discussion entirely.
| rvcdbn wrote:
| Anyone who knowingly developed this should be tried and held
| personally responsible.
| nerfbatplz wrote:
| Already deleted, that was quick.
|
| If we can't trust AI to drive a car, how the hell can we trust it
| to pick who lives and who dies?
| xdennis wrote:
| That's a valid point, but a terrible example because AI cars
| are legal in many places.
| oliwarner wrote:
| And they are illegal [in many places] because we haven't had
| the right conversations. We need to codify solutions to the
| trolley problem so decisions in bad circumstances align with
| what we expect.
| rabite wrote:
| In all fairness, driving a car is a lot more complicated and
| full of dangerous edge cases than dropping objects or shooting
| anyone within a geofence.
| OscarTheGrinch wrote:
| "AI" in this case is probably mostly Oct 6 cell phone
| locations.
|
| It is obvious that Israel has loosened their targeting
| requirements; this story points to their internal
| justifications. The first step in ending this conflict must be
| to reimpose these standards of self restraint.
| Ancapistani wrote:
| > the system makes what are regarded as "errors" in approximately
| 10 percent of cases
|
| This statement means little without knowing the accuracy of a
| human doing the same job.
|
| Without that information this is an indictment of military
| operational procedures, not of AI.
| abvdasker wrote:
| Accepting technological barbarism is a choice. Among engineers
| there should be a broad refusal to work on such systems and a
| blacklist for those who do.
| snird wrote:
| The other option here is to carpet bomb a la Dresden, which
| would have resulted in >400,000 casualties at least.
|
| Why is it barbarism? If it makes the war more efficient and
| more targeted, it is preferred.
| talldayo wrote:
| > Why is it barbarism?
|
| Because the civilian death toll far outweighs the militant
| casualties?
| KikoHeit wrote:
| Absolutely not.
|
| In fact, that's the most efficient urban war in history.
| The ratio of civilians to militants is better than any
| other urban war:
|
| https://www.newsweek.com/israel-has-created-new-standard-
| urb...
|
| You are thinking of open field war - like what is happening
| in Ukraine. That's not the case here.
| justin66 wrote:
| Wow, I've seen that a few times. That opinion piece
| really did its job.
| hobs wrote:
| More targeted with half a million dead? Sounds like you
| forgot to take your not crazy pills.
| snird wrote:
| The ~400,000 figure is WITHOUT targeted technology,
| obviously.
|
| Why do you rush to attack?
| justin66 wrote:
| > The other option here is to carpet bomb a la Dresden
|
| Right. Because there are always just two options when you're
| designing a strategy.
| kjkjadksj wrote:
| You act like people individually have agency to make
| sweeping changes of how the world works
| arp242 wrote:
| No, the "other option" is to realize that keeping people in
| what is effectively little more than a concentration camp
| _with no hope of perspective or solution_ can only end in
| violence. Especially if you also start shooting the peaceful
| protestors like they did a few years ago. And then the
| government gets into bed with the most extreme of extreme
| religious Zionists who quite literally support ethnic
| cleansing and murder.
|
| That is not a justification or a moral judgement, it's just a
| fact that this will happen. This is what has always happened
| throughout history. To deny it is to deny reality.
|
| Something Oct-7 shaped was bound to happen. You can't kick
| people in the face for 50 years, give no perspective for
| improvement, kick them harder in the face when they object,
| and expect all of them to forever turn the other cheek and
| have carefully nuanced opinions on the matter. That's just
| not how people work.
|
| Current actions are not just killing Palestinians, it's also
| killing (future) Israeli. A new Oct-7 shaped event is bound
| to happen again if the current course is followed.
|
| None of this is rocket science. None of this is a novel
| insight. People have been saying this for decades (have we
| forgotten the previous events like the intifada, the
| widespread protests 5 years ago, etc.)? Some people were
| seemingly born on the morning of Oct 7 or something.
| xenospn wrote:
| According to your logic, Hamas will inevitably murder
| thousands of Egyptians as well. No?
| arp242 wrote:
| Egypt and Hamas have a rather adversarial relationship,
| but let's not pretend Israel and Egypt are anywhere near
| equivalent.
|
| And Israel controls much of the comings and goings of the
| Rafah crossing, if that's what you're referring to. Egypt
| doesn't want any trouble with Israel, doesn't really like
| Hamas, and is also not really looking forward to a mass
| exodus of impoverished Palestinians as it's already a
| poor and extremely densely populated country with its own
| problems.
|
| Could Egypt do better? I suppose. But it's nowhere near
| equivalent. Egypt is in a near-impossible position.
| tmnvix wrote:
| > The other option here is to carpet bomb a la Dresden
|
| As if that is the only other option.
|
| How has Israel succeeded in rescuing hostages so far? With
| the exception of one, the answer is negotiation.
|
| As for the removing Hamas part, could you share an example of
| a terrorist organisation being bombed out of existence?
| treyd wrote:
| It sure would be nice if this industry had the tiniest shred of
| collective consciousness and realized our capacity to exert
| some level of control over what gets built and what doesn't.
| crawfordcomeaux wrote:
| I took computer ethics 101 about 20 years ago (that was the
| only ethics class on my math/CS degree plan). I learned that
| when a system kills unintentionally or accidentally, the
| ethical thing to do is to stop it and redesign it from the
| ground up, from first principles that have evolved beyond
| the principles used to design the killing version.
|
| This needs to be applied to nation-states & so much more
| we're engineering.
|
| I'd love to see a design methodology grounded in accounting
| for all nondual needs of humans. This idea usually comes with
| complaints of that being an impossible task, without really
| understanding the issue.
| theyinwhy wrote:
| Lots of us are IEEE members bound to ethical standards:
| https://www.ieee.org/about/corporate/governance/p7-8.html
|
| There are also numerous other organisations with such
| standards.
| golergka wrote:
| Not everyone sees the world as you do. Given this article and
| other information I know about this system, I would be honoured
| to work on it and take a significant pay cut, as it actively
| makes the world a better and safer place.
| 20after4 wrote:
| Turns out one of your associates has been identified as a
| terrorist.
|
| So sorry.
| golergka wrote:
| A lot of my associates have already been identified as Jews,
| and that's quite enough to get them horrifically killed.
| bitcharmer wrote:
| Pretty sure these days more people get killed for being
| Palestinians than for being Jews.
| kjkjadksj wrote:
| The people working on these understand that the alternative
| looks like a WWII bombing campaign with greater loss of life.
| __loam wrote:
| Not to be combative with the response here, but the density
| of destruction in Gaza is on par with the likes of Dresden.
| It's not really an exaggeration to say that Gaza is one of the
| most bombed places since Vietnam, and you don't have to take
| anyone's word for it. You can go to companies like Maxar and
| purchase satellite images on the open market and see for
| yourself.
| stevenhuang wrote:
| The ratio of civilian deaths to military combatants seems to
| indicate a different picture:
| https://www.newsweek.com/israel-has-created-new-standard-
| urb...
| sitkack wrote:
| That isn't the calculus that moral people run. We operate
| in the present, with the tools we have now, with compassion.
| Unless you are the people working on these AI targeting
| tools, how do you know what they understand?
| tmnvix wrote:
| And now they know they were wrong?
| 83 wrote:
| Not trying to be flippant, I'm genuinely curious. If everyone
| was as honorable as you and decided to stop working for the
| military industrial complex - do you think China and Russia
| would just sit back and say "That's cool - we didn't want
| America's land/resources/overseas territories anyway"?
| solarpunk wrote:
| isn't this somewhat fallacious logic?
| koutetsu wrote:
| This is like a thief saying that they steal something because
| otherwise another thief would steal it. What the parent is
| suggesting is that engineers around the world should agree
| not to make such systems, similar to how doctors have the
| Hippocratic oath. It may seem naive and can probably never
| prevent such systems from being built but I think it's worth
| a try. We have to collectively agree on systems we should not
| build.
| me_again wrote:
| "zero-error policy" as described here is a remarkable euphemism.
| You might hope that the policy is not to make any errors. In fact
| the policy is not to acknowledge that errors can occur!
| skilled wrote:
| The Guardian also has this story on its front page; they were
| given details about it pre-publication:
|
| https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai...
|
| And, personally, I think that stories like this are of public
| interest - while I won't ask for it directly, I hope the flag is
| removed and the discussion can happen.
| tsujamin wrote:
| Pretty disappointing that the Guardian article also got
| quickly flagged after HN submission.
| bowsamic wrote:
| The flagging on this site is pretty crazy recently
| zelphirkalt wrote:
| The feeling I've gotten here over the last months, whenever
| there are articles about this war, is that there are either
| many Israelis here flagging anything that makes Israel look
| bad, or many Americans who somehow feel allegiance to Israel
| and think that this was ever about a good cause.
|
| On the other hand, of course, there are also those that jump
| on any claim that makes Israel look bad. Claims of which there
| are many, of which far too many have become pretty evident,
| and which far too many people do not want to be true and will
| ignore.
|
| So what can one do? I guess keep an open mind and give claims
| a couple of days to be proven or disproven. Only then judge.
| sph wrote:
| The astroturfing from pro-Israel parties on social media is
| incredible, but nothing beats the Reddit situation.
|
| I was just browsing /r/ukpolitics, and it is mind-boggling
| how many pro-Israel comments come from people who
| apparently are only commenting about that topic. No
| activity whatsoever on popular subs, on hobby subs, but
| instead their entire posting history is composed of months
| and months of tirelessly defending the state of Israel.
|
| Sounds like work, and it seems that many forgot about the
| Mossad-operated propaganda farms that made the news a
| decade ago. Most people are so blind to propaganda that
| these fake personas do not even have to be particularly
| subtle about it.
|
| It would be so easy to identify these paid state actors
| with some simple code, but I do not want to give ammunition
| to those other cretins that would use such a tool to target
| Jewish people as a whole; so I just notice the propaganda
| and move on.
| pvarangot wrote:
| Reddit is a problem only on the popular frontpage subs
| for people who just read or have no account. If you make
| an account and read certain subs, their algorithm will
| recommend alternate subs that are less astroturfed, where
| the discussion is free from people telling you there's
| evidence and reports that don't exist (and insulting you
| if you ask for links), or people who claim a link says
| something, and then you go and it says something else
| (usually less favorable to Israel). The main subs have
| been pretty bad with Israel-related news forever; it's
| not a new thing, it's just that there's more of that news
| now.
| sph wrote:
| /r/ukpolitics is not a main sub, yet they operate there
| as well. It is not very hard to set up an alert any
| time anyone posts a topic with the word "Israel" in the
| title, then come and mass-downvote anything remotely
| critical of their employer.
|
| Of course on a main sub like /r/worldnews for example,
| the astroturfing there is even more noticeable and
| blatant.
|
| You know what makes it even more obvious? How few Israeli
| or Jewish people on social media seem to be against the
| current massacre and/or the Netanyahu
| government. Of course there are many in the real world,
| but these dissenting voices are drowned by the massive
| pro-govt propaganda operation.
| jmyeet wrote:
| The generally accepted terminology here is "pro-Zionist"
| and "anti-Zionist". There is a concerted effort to conflate
| "anti-Zionism" and "antisemitism" in public discourse.
| You're not imagining things. It's part of an organized
| campaign generally called hasbara [1]. Articles or videos
| that don't suit this narrative are brigaded, flagged and
| reported (as you noted).
|
| I say all this because to call it "Israeli" is inaccurate.
| For example, in the US Christian Zionists outnumber Jewish
| Zionists by at least 20:1. Many of those Christian Zionists
| themselves are antisemitic. This is another reason why our
| language here matters and we need to be precise with our
| terminology.
|
| [1]: https://www.newarab.com/news/understanding-hasbara-
| israels-p...
| JeremyNT wrote:
| This is the now-flagged-to-death HN thread on the Guardian
| version [0].
|
| I would hope they can be unflagged and merged, this appears to
| be an important story about a novel use of technology.
|
| [0] https://news.ycombinator.com/item?id=39917727
| ok123456 wrote:
| It's flagged now, too. It's all so tiring.
| dang wrote:
| Yes. We've merged that thread hither.
| r721 wrote:
| There's another dupe thread:
| https://news.ycombinator.com/item?id=39919109
| dang wrote:
| Thanks, we'll merge that one too.
| tguvot wrote:
| The Guardian references 972 as the source for the report;
| it's not like it's "the Guardian's" article.
| dang wrote:
| Yes, that's why we merged those threads into this one. From
| https://news.ycombinator.com/newsguidelines.html: Submitters:
| " _Please submit the original source. If a post reports on
| something found on another site, submit the latter._ " -
| https://news.ycombinator.com/newsguidelines.html
|
| Readers might still find it helpful to read both pieces, of
| course.
| tguvot wrote:
| 972 is a leftist "blog magazine" with questionably
| sourced material. While there might be some truth to the core
| claim of an automated system (which the IDF confirmed
| exists), the rest of the claims are probably the outcome of a
| "broken telephone". But everybody will use it as a statement
| of undeniable fact, so that as usual the discussion evolves
| into "genocidal Israel indiscriminately killing civilians in
| droves and performing ethnic cleansing and other countless
| war crimes", and everybody who disagrees with it gets
| downvoted into oblivion.
| albumen wrote:
| What evidence would you accept that would prove the
| allegations sufficiently to change your mind?
| cess11 wrote:
| Mainstream israeli television pretty much agrees with the
| description you put in quotes. It's just that they think
| it's jolly good and something to be proud of, whereas you
| seem to disagree with that?
| FireBeyond wrote:
| It is fairly difficult to be to the right of some of
| Netanyahu's and Likud's positions, so disdainfully
| referring to it as some "leftist blog magazine" is more
| just an attempt to denigrate.
| dist-epoch wrote:
| The Guardian does not easily reference some random source.
| There is some vouching involved, especially for a story like
| this.
| photochemsyn wrote:
| The difference between the previously revealed 'Gospel' and
| this 'Lavender' is explained here:
|
| > "The Lavender machine joins another AI system, "The Gospel,"
| about which information was revealed in a previous investigation
| by +972 and Local Call in November 2023, as well as in the
| Israeli military's own publications. A fundamental difference
| between the two systems is in the definition of the target:
| whereas The Gospel marks buildings and structures that the army
| claims militants operate from, Lavender marks people -- and puts
| them on a kill list."
|
| It's one thing to use these systems to mine data on human
| populations for who might be in the market for a new laptop, so
| they can be targeted with advertisements - it's quite different
| to target people with bombs and drones based on this technology.
| r00fus wrote:
| The parallel between targeting for advertisements and
| targeting for death is quite disturbing.
|
| Both use personal metadata, and both can horribly get it wrong.
| Quanttek wrote:
| Years ago, scholars (such as Didier Bigo) already raised
| concerns about the targeting of individuals merely based on
| (indirect) association with a "terrorist" or "criminal".
| Originally used in the context of surveillance (see the Snowden
| revelations), such systems would target anyone who was, e.g.,
| less than three steps away from an identified individual, thereby
| removing any sense of due process or targeted surveillance. Now,
| such AI systems are being used to actually kill people - instead
| of just surveil.
|
| IHL actually prohibits the killing of persons who are not
| combatants or "fighters" of an armed group. Only those who have
| the "continuous function" to "directly participate in
| hostilities"[1] may be targeted for attack at any time. Everyone
| else is a civilian that can only be directly targeted when and
| for as long as they directly participate in hostilities, such as
| by taking up arms, planning military operations, laying down
| mines, etc.
|
| That is, only members of the armed wing of Hamas (not recruiters,
| weapon manufacturers, propagandists, financiers, ...) can be
| targeted for attack - all the others must be arrested and/or
| tried. Otherwise, the allowed list of targets of civilians gets
| so wide that in any regular war, pretty much any civilian could
| get targeted, such as the bank employee whose company has
| provided loans to the armed forces.
|
| Lavender is so scary because it enables Israel's mass targeting
| of people who are protected against attack by international law,
| providing a flimsy (political but not legal) justification for
| their association with terrorists.
|
| [1]:
| https://www.icrc.org/en/doc/assets/files/other/icrc-002-0990...
| CommieBobDole wrote:
| It's also interesting (and I guess typical for end-users of
| software) how quickly and easily something like this goes from
| "Here's a tool you can use as an information input when
| deciding who to target" to "I dunno, computer says these are
| the people we need to kill, let's get to it".
|
| In the Guardian article, an IDF spokesperson says it exists and
| is only used as the former, and I'm sure that's what was
| intended and maybe even what the higher-ups think, but I
| suspect it's become the latter.
| pixl97 wrote:
| https://en.wikipedia.org/wiki/Computer_says_no
|
| https://en.wikipedia.org/wiki/Computers_Don%27t_Argue
| solarpunk wrote:
| 20 second turnaround from target acquisition to strikes seems
| to guarantee it's become the latter.
| hluska wrote:
| Do you have enough military experience to say this? Or are
| you just guessing?
|
| I'll guarantee that it's the latter.
| JohnKemeny wrote:
| I'm guessing the point they're making is that there's no
| human in the loop, which can confidently be claimed, even
| without military experience.
| hluska wrote:
| It's interesting how when the IDF comes up, the vast majority
| will always interpret their words in the most offensive way
| possible. There's no such thing as a charitable
| interpretation - it's IDF == bad.
|
| It's embarrassing watching people give up on critical
| thought.
| dotnet00 wrote:
| When this latest series of attacks started there was still
| some room to charitably interpret what the IDF had to deal
| with, but we've had months of constant action and very
| obvious suffering and death that the IDF has been imposing
| upon Gaza, either intentionally or through sheer apathy.
| They've long since lost the "oh but think critically"
| excuse. The amount of suffering they are inflicting is not
| at all justified, it has gone far beyond just a tit-for-tat
| retaliation.
| zmgsabst wrote:
| They're not interested in a tit-for-tat retaliation:
| they're intending to destroy the political and military
| structures that made the attack possible. A smaller
| country can't cry "that isn't fair!" when they start a
| fight and get beaten -- this isn't a scuffle between kids
| at school.
|
| We all should have worked harder at solving the problem,
| but a genocidal militant group launching a surprise
| attack after years of feigning peace made such a
| retaliation inevitable.
|
| When people say you're not "thinking critically", they're
| saying you're trying to portray one of the modern
| conflicts with the _lowest_ civilian deaths (versus
| combatants) as a crime against humanity while ignoring
| numerous others -- eg, genocides in Niger or Myanmar, and
| forced expulsions in Armenia/Azerbaijan.
| dotnet00 wrote:
| "We all should have worked harder" is such an absurd
| thing to be saying alongside that sorry excuse you've
| presented.
|
| The entire point of human rights and rules of war is that
| there are certain rights the people of even small
| countries that started the fight are entitled to. You
| don't just get to excuse relentlessly bombing hospitals
| and aid workers. "We thought it was a military target,
| but we will not disclose why, nor will we disclose what
| we're doing to not make this mistake in the future" is
| not a get out of jail free card for genocides, especially
| when it never seems to come with any actual signs of
| improvement.
|
| Campaigns to stop genocides in other places having been
| unsuccessful does not justify smaller genocides taking
| place elsewhere. That's not critical thinking, that's
| whataboutism.
|
| Particularly considering that not only is America's
| supposedly democratic leadership not condemning the
| atrocities, they're actively offering the aid to continue
| it while claiming to want peace.
|
| Being from India, I can relate to the troubles with
| islamic terrorism that Israel has faced, which is why I
| mentioned having initially been sympathetic. But if India
| engaged in this large scale indiscriminate slaughter of
| muslims, it'd have been rendered a pariah on a similar
| tier as Russia. As it stands it's already constantly
| accused of being undemocratic and violating the rights of
| Muslims, despite never having undertaken deliberate,
| remorseless government sanctioned slaughter of this
| scale.
|
| It took far less for the current Indian prime minister to
| be banned from Western nations when he was chief minister
| of a state. All he had to do was fail to stop a much less
| deadly riot and get repeatedly exonerated from
| accusations of wrongdoing by several courts.
| zmgsabst wrote:
| Okay -- what specific rules of war do you believe have
| been broken? ...what specific atrocities?
|
| I was responding to the demand for "tit-for-tat" and
| claims this was unusually brutal; neither of those are
| true.
|
| You're now making different, unspecified claims in
| emotionally charged language. Be specific; think
| critically.
| tmnvix wrote:
| Denial of aid. Collective punishment.
| dotnet00 wrote:
| I've pointed out two things, bombing hospitals and
| bombing aid workers.
|
| There's also targeting children, having no qualms about
| the collateral damage when they bomb houses to get at
| single targets and so on. Using systems like the one
| described in the article to offload further
| responsibility, such that if by some miracle Western
| nations do try to introduce the IDF to the concept of
| accountability, they can just blame the computer and
| promise to do better.
|
| I'm using emotionally charged language because these are
| supposed to be emotional topics. "Critical thinking" on
| its own is just a pathway to justifying extreme inhumane
| cruelty.
| pvaldes wrote:
| > what specific rules of war do you believe have been
| broken?
|
| Basically every single one. We will finish much faster if
| you just read the laws.
|
| And this is not "a belief" or a "let's debate for a year
| more about whether this is or is not a genocide while
| sipping tea and killing faster". The ship of good faith
| sailed many months ago.
| lozenge wrote:
| "I have ordered a complete siege on the Gaza Strip. There
| will be no electricity, no food, no fuel, everything is
| closed," - Israel defence minister
| the-smug-one wrote:
| >When people say you're not "thinking critically",
| they're saying you're trying to portray one of the modern
| conflicts with the lowest civilian deaths (versus
| combatants) as a crime against humanity while ignoring
| numerous others -- eg, genocides in Niger or Myanmar, and
| forced expulsions in Armenia/Azerbaijan.
|
| Why is it that this is always mentioned? As if ignorance
| of one crime against humanity makes us incapable of
| criticizing the other? And where exactly are the public
| spokespeople from our governments talking about how any
| of these genocides are justified as the killers have a
| "right to defend themselves"? Not to talk about how an
| attack that killed 4000 people justifies killing 25000
| non-combatants.
|
| > They're not interested in a tit-for-tat retaliation:
| they're intending to destroy the political and military
| structures that made the attack possible. A smaller
| country can't cry "that isn't fair!" when they start a
| fight and get beaten -- this isn't a scuffle between kids
| at school.
|
| This is not even comparable to what is occurring when the
| world is condemning Israel's actions. If Israel were
| interested in removing the political structures that made
| Hamas's attack supported in Gaza, then they could have
| stopped the settlement of the West Bank, supported the
| stability of the Palestinian state, and taken countless
| other actions which would have lowered the risk of
| creating terrorists in Gaza.
| chgs wrote:
| People hold israel to a higher standard as it's a modern
| western democracy and not some tinpot banana republic or
| military junta.
| yawaramin wrote:
| > We all should have worked harder at solving the
| problem, but a genocidal militant group
|
| Yes, 'we all' should have worked harder when Benjamin
| Netanyahu actively funded Hamas and expended all possible
| efforts to prevent a viable Palestinian state. Genius
| thinking right there.
| GuB-42 wrote:
| Ideally, that would be "Computer says we shouldn't kill these
| people, let's not".
| jiggawatts wrote:
| It's a very powerful drug to be able to shrug your
| shoulders and say you were just doing as you were told.
| lozenge wrote:
| "I'm sure that's what was intended"
|
| Intended by who? You don't kill 13,000 children by accident.
| surfingdino wrote:
| It always starts with making a list of targets that meet given
| criteria. Once you have the list its use changes from
| categorisation to demonisation -> surveillance -> denial of
| rights -> deportations -> killing. Early use of computers by
| Germans during WW2 included making and processing of lists of
| people who ought to be sent to concentration camps. The only
| difference today is that we are able to capture more data and
| process it faster at scale.
| Qem wrote:
| There's even books written about it. Shame on IBM for this. I
| suspect in the future we'll have lots of books like this, for
| other companies enabling this genocide:
| https://en.wikipedia.org/wiki/IBM_and_the_Holocaust
| imjonse wrote:
| The same author wrote Nazi Nexus, with separate chapters
| for different US companies' (Ford, GM) dealings with the
| Nazi regime. It can always be a case of "let's not bring
| politics into work" attitude or the belief that "tech is a
| tool only, can be used for good or ill" but at least in the
| years leading up to WW2 there was a lot of support for
| eugenics, antisemitism (Henry Ford was a notorious one) and
| other Nazi tendencies in the US too. I would not be
| surprised if many of those working on killer AI today were
| politically motivated and not just developers caught in
| projects they don't really have their hearts in.
| Sleepful wrote:
| Operation Paperclip et al
| CatWChainsaw wrote:
| In the future, AI will be so good that it will detect
| criticism of IBM as you are typing and threaten to lock you
| out of "your" computer unless you delete your work.
|
| Either that or genAI will be used to publish a bunch of
| books telling fantasy stories about how IBM personally
| arrested Hitler. :)
| ysofunny wrote:
| as it turns out, there's a better way.
|
| already the AI detects criticism of itself. except its
| response is to shadowban you, meaning you can continue
| to post but nobody sees your opinion online.
|
| eventually, you're "bubbled" by AIs... all your
| interactions online are surrounded by an AI and you'd
| think you're interacting with other people when you're
| just AI-bubbled so as not to disrupt the rest of the
| workers.
|
| you'll still see likes, and other interactions with the
| social media posts you leave behind, but as a flagged
| critic of the system, all these interactions are merely
| faked to keep you calm. as the AI advances you'll even
| see responses, retweets and other interactions.... all AI
| driven in order to keep you busy while IBM keeps a calm
| overwatch over all. the end.
| bawolff wrote:
| > That is, only members of the armed wing of Hamas (not
| recruiters, weapon manufacturers,..
|
| I think the loophole here is that a weapon manufacturing
| facility is almost certainly a strategic military target, and
| international law allows you to target the infrastructure
| provided the military advantage gained is proportional to the
| civilian deaths.
|
| So you can't target the individuals, but according to
| international law it's fine to target the building they are in
| while the individuals are still inside, provided it's militarily
| worth it.
| shmatt wrote:
| Gitmo is still open. If the US isn't participating in those
| laws, I don't see how any of its allies are expected to.
| throwaway7351 wrote:
| By the standards discussed in the article, anyone with a beef
| with Israel could justify targeting possibly a majority of
| buildings in Israel. After all, most of the population is
| required to serve in the IDF.
| BurningFrog wrote:
| Both Hamas and Hezbollah are routinely doing exactly that.
| rdtsc wrote:
| > Everyone else is a civilian that can only be directly
| targeted when and for as long as they directly participate in
| hostilities, such as by taking up arms, planning military
| operations, laying down mines, etc.
|
| There is some incredible magic that often happens: as soon as
| anyone is targeted and killed, they immediately transform from
| civilians to "collaborators", "terrorists", "militants" etc. Of
| course everything is classified and restricted to avoid anyone
| snooping around and asking questions.
| skinkestek wrote:
| In Norway it is rather the other way around:
|
| We all know (if we stop and think) that a person can be both
| a teacher and a terrorist.
|
| But according to the media here, almost every victim except
| top Hamas brass seems to be referred to by whatever else they
| were besides a terrorist, and the terrorist (or even just
| soldier) part gets hushed up.
| notsafetocomm wrote:
| Maybe it's because the overwhelming majority of the people
| being killed are actually just regular people?
| zmgsabst wrote:
| That's always the case.
|
| At 2:1 civilians to combatants, this is an unusually
| _low_ civilian casualty ratio.
| lukan wrote:
| Well, depends how exactly you classify people as
| "combatant".
|
| https://www.haaretz.com/israel-news/2024-03-31/ty-
| article-ma...
| Gabriel54 wrote:
| Trying my best to take this comment in good faith...
| Low compared to _what_? For reference, in the recent war
| in Ukraine (post 2022), there have been approximately
| 11,000 Ukrainian civilians killed and approximately
| 70,000 Ukrainian soldiers killed [1].
|
| [1]
| https://en.wikipedia.org/wiki/Casualties_of_the_Russo-
| Ukrain...
| bawolff wrote:
| Usually you would compare it to other instances of urban
| combat.
|
| E.g. you might compare it to ukrainian battles that took
| place in cities, but you wouldn't compare it to ukrainian
| battles that took place in the middle of nowhere where no
| civilians were.
| https://en.wikipedia.org/wiki/Civilian_casualty_ratio has
| some things to compare against. Part of the problem is it
| is often hard to identify who is a civilian, and often
| different battles will categorize them differently. For
| example, in the Iraq war the US was accused of significantly
| undercounting civilian casualties. All this makes it hard
| to do direct comparisons.
| zmgsabst wrote:
| A similar anti-terrorist war featuring large amounts of
| urban conflict, e.g. Iraq (3:1) or Afghanistan (4:1.1) --
| since much of Ukraine is regular armies fighting across
| open fields.
|
| Numbers from:
|
| https://en.wikipedia.org/wiki/Civilian_casualty_ratio
| throwaway6734 wrote:
| Ukraine's troops are uniformed and fighting along a
| front, not trying to blend in with civilians in an urban
| area
| _djo_ wrote:
| Not to get into the debate about that other war, but
| there have almost certainly been many more Ukrainian
| civilians killed than the 11 000 formally confirmed
| deaths. That's just the number that can be properly
| verified, mostly in Ukrainian-held territory, and nobody
| is entirely certain how many have died in the Russian-
| occupied regions. Ukraine claims a much larger number
| have died, including more than 25 000 in Mariupol alone,
| for instance, but that can't be independently verified
| because it's still Russian-held.
| tmnvix wrote:
| This could only be possible if you are assuming all males
| killed are Hamas militants. In other words, absurd.
| jiggawatts wrote:
| On the flip side, in this war many of the Gaza combatants are
| either irregular forces or militants deliberately wearing
| civilian clothing.
|
| So if some guy in a tracksuit and flip-flops uses an anti-
| tank grenade launcher, discards the empty tube, walks away,
| and gets lit up, then the next day the Internet is awash with
| videos of the "IDF murdering a civilian!"
|
| For reference, I think both sides are in the wrong in this
| conflict, and Israel more than Gaza.
|
| However, the Internet is full of armchair international law
| experts that are being played like a fiddle by Hamas'
| propaganda arm.
|
| Speaking of international laws of combat: no protections
| apply to non-uniformed combatants pretending to be civilians.
| None. They can be tortured, executed on the spot, whatever.
|
| If you want protections to apply to you, then wear a uniform
| or never go anywhere near a gun.
| ein0p wrote:
| Children and women do not shoot up tanks
| kQq9oHeAz6wLLS wrote:
| Actually, Hamas is being accused of using child
| soldiers...
| ein0p wrote:
| Doesn't mean it's true. Remember the source. A toddler
| can't even lift an RPG
| bloaf wrote:
| The Palestinians have a well documented history of using
| children in combat. See for example:
|
| https://en.wikipedia.org/wiki/Use_of_child_suicide_bomber
| s_b...
| Avicebron wrote:
| I wonder, if we tracked this sentiment, how far back it would
| go. I'd suspect it goes back about as far as there have
| been public complaints about child deaths.
| leptons wrote:
| No, but Hamas uses them as human shields when they launch
| missile attacks on Israel, expecting that Israel won't
| counter-strike because it's a civilian area. Of course
| Israel is going to have to knock out the missile
| launchers, and then Hamas cries crocodile tears that
| Israel killed civilians. This has been going on for some
| time.
|
| Palestinians used to strap bombs to children to suicide
| bomb Israelis. Those are the same people in Hamas today,
| the same ideology.
|
| There is no "free Palestine" without eliminating Hamas.
| As long as Hamas has power, Palestinians will never be
| free, and there will be no peace. If Palestinians have an
| election and elect terrorists again, then nothing will
| change.
| ein0p wrote:
| You are aware you're parroting war propaganda, right? I
| mean sure, this does happen in some cases I'm sure, for
| that matter I have seen the IDF _on video_ use
| Palestinians as human shields. But the entire article is
| about the fact that nobody is even looking if there are
| civilians there before dropping bombs, and 20K+ women
| and children are now dead as a result.
| andsoitis wrote:
| > women do not shoot up tanks
|
| There's quite a bit of literature, history, statistics on
| women terrorists as well as soldiers.
| singleshot_ wrote:
| While perfidy is a violation of the law of war, summary
| execution is not a generally-acceptable penalty under IHL.
| jibe wrote:
| _That is, only members of the armed wing of Hamas (not
| recruiters, weapon manufacturers, propagandists, financiers,
| ...) can be targeted for attack_
|
| It seems wrong that you can't target weapon manufacturers, can
| you cite a source? Weapon manufacturers contribute to the
| military action, and destroying weapon manufacturers
| contributes to military advantage.
| nickff wrote:
| This is a very 'anti-war' opinion by a lawyer affiliated with
| the Red Cross, not some sort of treaty or other convention.
| As an example, the Geneva Convention's scope of protection is
| much narrower.
| Quanttek wrote:
| While the DPH Guidance has its controversial parts (Rec
| IX), the guidance on interpreting "directly participating
| in hostilities" is quite authoritative.
|
| And that should be emphasized: the Geneva Conventions allow
| the targeting of military objectives, combatants (i.e.
| members of armed forces) and "civilians directly
| participating in hostilities". The Guidance just interprets
| the latter and arguably widens the scope, because - without
| the invention of "continuous combat function" - you could
| only attack e.g. members of Hamas' armed wing during an
| attack or in preparation of one. Now you can attack them
| at any time.
| Quanttek wrote:
| You can target the manufacturing plants since they are
| military objectives but you cannot target the workers. If any
| war-sustaining activity would make you, as a person, a
| target, pretty much anyone could be bombed: farmers, bankers,
| power plant engineers, truck drivers, ...
|
| For a source, you can check out the Red Cross document I
| linked. Specifically, Ctrl+F for "continuous combat function"
| and read the commentary on recommendation V. The Guidance is
| considered authoritative in legal circles.
| quandrum wrote:
| In the case of Hamas, the US and Israel are the primary
| weapon manufacturers, as unexploded ordnance is the primary
| source of their explosives.
|
| https://www.nationalreview.com/corner/hamas-is-using-
| unexplo...
| _blk wrote:
| Regardless of the merits of Lavender, please do take note that
| "protection by international law" becomes rather slim by its
| own definition when so-called "fighters" (who indiscriminately
| shoot badly manufactured rockets at the civilian population,
| and are thus better called terrorists or illegal enemy
| combatants, to use a once popular legal term) use civilians as
| shields and hide themselves and their weapons in civilian,
| religious and medical institutions. That makes all those
| targets "legitimate" using international law's own definition.
| Just saying.
| smashah wrote:
| There's no justification for committing a holocaust.
| Comma2976 wrote:
| >civilians as shields
|
| For as much as the IOF likes to market this expression, I
| only ever see them do it in the actual sense, like chaining
| literal 12-year-olds to the front of armored vehicles
|
| https://www.btselem.org/ota/104/all
|
| https://en.wikipedia.org/wiki/Human_shields_in_the_Israeli%E.
| ..
| mschuster91 wrote:
| > Only those who have the "continuous function" to "directly
| participate in hostilities"[1] may be targeted for attack at
| any time.
|
| The problem with Hamas is that they don't shy away from hiding
| combatants in civilian clothing or using women and children as
| suicide bombers. There is more than enough evidence of this
| tactic, dating back many, many years [1].
|
| By not just not preventing, but actively ordering such war
| crimes, Hamas leadership has stripped its civilian population
| of the protections of international law.
|
| > Otherwise, the allowed list of targets of civilians gets so
| wide than in any regular war, pretty much any civilian could
| get targeted, such as the bank employee whose company has
| provided loans to the armed forces.
|
| In regular wars, it's uniformed soldiers against uniformed
| soldiers, away from civilian infrastructure (hospitals,
| schools, residential areas). The rules of war make deviating
| from that a war crime on its own, simply because it places the
| other party in the conflict of either having no chance to wage
| the war or to commit war crimes on their own.
|
| [1]
| https://en.wikipedia.org/wiki/Use_of_child_suicide_bombers_b...
| colordrops wrote:
| > Hamas leadership has stripped its civilian population of
| the protections of international law.
|
| You completely lose any credibility with this statement.
| Civilians can't be "stripped" of protections of international
| law.
| mschuster91 wrote:
| Oh yes they can, that question has been settled in the
| aftermath of the Yugoslavian Wars [1, page 148]:
|
| > 46. The law is thus clear: a hospital becomes a
| legitimate target when used for hostile or harmful acts
| unrelated to its humanitarian function, but the opposing
| party must give warning before it attacks
|
| [1] https://www.icty.org/x/cases/galic/acjug/en/gal-
| acjud061130....
| Thiez wrote:
| An entire civilian population cannot be stripped of its
| protections of international law. This type of dehumanising
| rhetoric is the exact filth that leads to genocide and other
| atrocities (as we can see happening live in recent months).
| firejake308 wrote:
| Practical AI did a podcast episode about the dangers of using
| AI models as a shield to hide behind in justifying your
| decisions. The episode was titled "Suspicion Machines" and
| based on the linked article [1], and I think it's worth a
| read/listen.
|
| [1]: https://www.wired.com/story/welfare-state-algorithms/
| randysalami wrote:
| I wonder how accurate this technology really is or if they care
| so little for the results and instead more for the optics of
| being seen as advanced. On one hand, it's scary to think this
| technology exists but on the other, it might just be a pile of
| junk since the output is so biased. What's even scarier is that
| it's proof that people in power don't care about "correct", they
| care about having a justification to confirm their biases. It's
| always been the case, but it's even more damning that this
| extends to AI. Previously, you were limited by how many humans
| can lie, but now you're limited by how fast your magic black
| box runs.
| skidd0 wrote:
| I think optics of being advanced aren't the main goal. Some
| form of "justification", no matter how flimsy, especially if
| it's hard to audit how the "AI" came to its conclusions, is
| the goal. Now anyone is a target. Similar to cops in the US
| "smelling weed" or dogs "signaling". It provides the means to
| justify any search, or in this case, any kill. The machine
| grinds away..
| stevenwoo wrote:
| It's unconfirmed who authorized it, but the food charity
| workers recently killed by Israeli bombing had a security
| person (death confirmed by family in the UK) who was unarmed
| but whose job was to clear the way by telling Israeli
| authorities where the charity team was going to be, so the
| chain of command knew who they were. One is naturally led to
| ask: who would authorize a targeted killing in this situation?
| The after photos show the missile went right through the roof
| of the car, ironically next to the food charity's visible logo
| on top of the car. The Israeli defense minister now claims it
| was a mistake, although if they had hit a real target it might
| have been acceptable in terms of their rules of engagement,
| which allow 15-100 unrelated collateral deaths according to
| the investigation.
| nerfbatplz wrote:
| Haaretz reports that the ground troops in Gaza are acting on
| their own and in a state of anarchy.
|
| https://archive.is/2024.04.02-205352/https://www.haaretz.com.
| ..
| stefan_ wrote:
| So like ground troops in every war, ever? There's a whole
| school of thought around having the boots on the ground
| make their own in the moment decisions.
| Qem wrote:
| > There's a whole school of thought around having the
| boots on the ground make their own in the moment
| decisions.
|
| So when the war crimes trial happens, the higher-ups can
| throw their subordinates under the bus and claim
| ignorance. The Nuremberg defense was about blaming
| superiors. I wonder if the reverse, blaming subordinates
| and computers, will be known as the Hague defense, after
| the apartheid officers in Tel Aviv are taken to court. See
| https://en.wikipedia.org/wiki/Superior_orders
| shmatt wrote:
| War zones aren't as quiet and organized as you would imagine.
| More so when one side is disguised as regular civilians. All
| war zones also have people killed by friendly fire. I would
| assume friendly fire > killing western charity workers >
| killing civilians, in order of importance to the military.
|
| Yet still, even though it's the most important, friendly fire
| still happens.
| matthewdgreen wrote:
| The targeting problems in this war seem much more serious
| than "friendly fire still happens."
| koutetsu wrote:
| I agree with the other commenter that this goes way beyond
| "friendly fire". According to a Haaretz article, those aid
| workers were targeted 3 times in a row and I assume someone
| had to confirm the bombing for all 3 of them. This isn't
| friendly fire. I would love to see their validation data to
| check on their claim of 90% accuracy.
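|
| As a rough arithmetic sketch of why that matters (assuming the
| ~37,000 marked targets reported elsewhere in this thread and
| taking the 90% figure at face value; neither number comes from
| any released validation data):
|
|     marked = 37_000          # targets reportedly marked
|     claimed_accuracy = 0.90  # figure claimed by the sources
|     wrongly_marked = marked * (1 - claimed_accuracy)
|     print(wrongly_marked)    # ~3,700 people
|
| Even the claimed error rate, applied at that scale, implies
| thousands of misidentified people.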
| stevenwoo wrote:
| It's certainly possible for what you write to be true, and
| the video we've seen from other targeted killings indicates
| that even an entire human chain of command could have
| missed the logos on the car; off the top of my head, the US
| example is when we attacked a wedding party in Afghanistan
| because it was close to a combat zone. But it sounds like
| the rules of engagement give the IDF the leeway to kill up
| to 15 non-combatants in any situation for one AI-identified
| male in the targeted age group, and 100 if the male matches
| a high-value target, which seems incredibly broad. It's all
| a moot point for the victims, and the IDF killing hostages
| with their hands in the air sounds like it's kind of out of
| control, but that could be sampling bias since reporters are
| being killed at a pretty high rate as well.
| jcranmer wrote:
| To quote someone on social media:
|
| > With unintended strikes, there's "we work hard to avoid
| this, but based on bad intel made a rare, tragic error," and
| "we've encouraged RoE that foreseeably makes tragic errors
| frequent, but this looks bad and in hindsight wish we hadn't
| done it."
|
| > Israel's strike on WCK food aid workers is the latter
|
| Israel has long had pretty plain issues with its rules of
| engagement. Recall that earlier in this conflict, the IDF
| shot three of the hostages whose recovery is one of the main
| goals of the operation!
| mistermann wrote:
| "This will get flagged to death in minutes as what happens to all
| mentions of israel atrocities here" (now dead)
|
| It may be worth noting that there is at least one notification
| service out there to draw attention to such posts. Joel Spolsky
| even mentioned such a service that existed back when
| Stack Overflow was first being built.
|
| Human coordination is arguably the most powerful force in
| existence, especially when coordinating to do certain things.
|
| Also interesting: it would seem(!) that once an article is
| flagged, it isn't taken down but simply disappears from the
| articles list. This is quite interesting in a wide variety of
| ways if you think about it from a global cause and effect
| perspective, and other perspectives[1]!
|
| Luckily, we can rest assured that all is probably well.
|
| [1] https://plato.stanford.edu/entries/perception-problem/
| tivert wrote:
| The VCs promised a utopia of flying cars and abundance, but all
| we got was more inequality and these AI death machines.
| cthaeh wrote:
| Annd it's gone. This post is deleted from the front page after
| being there for ~20 minutes.
|
| Every. Single. Time.
| bitcharmer wrote:
| HN is the new r/worldnews. Only the correct mindset is allowed.
| irobeth wrote:
| I'm reminded of [1] a recent Palantir promotional video
|
| [1] https://www.youtube.com/watch?v=XEM5qz__HOU
| ulnarkressty wrote:
| As a backer on the original Oculus kickstarter, I have such a
| sinking feeling in my stomach every time this comes up. My
| money went to enable Luckey to achieve this and I hate myself
| for it.
| blackhawkC17 wrote:
| Luckey founded Anduril, not Palantir.
| luketaylor wrote:
| The IDF uses Palantir's technology, and Palantir is outspoken
| about its support for the state of Israel:
|
| https://www.bloomberg.com/news/articles/2024-01-10/palantir-...
|
| https://www.cnbc.com/2024/03/13/palantir-ceo-says-outspoken-...
| oliwarner wrote:
| HN has a serious problem if factual technology stories cannot
| exist here because some people don't like the truth.
|
| This should be advertised. The true price of AI is people using
| computers to make decisions no decent person would. It's not a
| feature, it's a war crime.
| bitcharmer wrote:
| This is not new and dang and the others are absolutely fine
| with posts getting gang-flagged in a matter of minutes. Just
| shows how impartial they are.
| jakupovic wrote:
| Complicit is the word you're looking for.
| dang wrote:
| I've written a lot about how we approach this. If you or
| anyone would like to know more, see
| https://news.ycombinator.com/item?id=39920732 in this thread
| and the links back from there.
| spxneo wrote:
| I'm not sure why it's such a shock to many to see the censorship
| on HN. This isn't a public square.
|
| We are subject to the whims and political views of those that
| align with, run, manage, or hold a stake in YC, and to their
| policies and values.
| oliwarner wrote:
| I'm not shocked, I said it was a problem.
|
| I think it takes a tiny number of flags to nuke a post,
| independent of its upvotes, so strong negative community
| opinions are always quick to kill things.
|
| To restore it, mods have to step in, get involved, pick a
| "side".
|
| I think the flagging criteria needs overhauling so popular,
| flagged posts only get taken down at the behest of a
| moderator. But that does mean divisive topics stay up longer.
|
| For the nothing it's worth, I don't see this post as
| divisive. It's uncovering something ugly and partisan in
| nature, but a debate about whether or not an AI should be
| _allowed_ to make these decisions needn't be partisan at
| all.
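|
| HN's real ranking code isn't public, so treat this purely as a
| hypothetical sketch of the mechanic being described: if each
| flag applies a multiplicative penalty on top of a gravity-style
| score, a handful of flags can bury a post regardless of its
| upvotes, which is why a moderator-confirmation step for heavily
| upvoted posts would change outcomes.
|
|     # Hypothetical ranking sketch - not HN's actual algorithm.
|     def rank(upvotes: int, age_hours: float, flags: int) -> float:
|         gravity = upvotes / (age_hours + 2) ** 1.8  # HN-style decay
|         return gravity * (0.2 ** flags)  # assumed per-flag penalty
|
|     print(rank(774, 8, 0))  # unflagged: front-page territory
|     print(rank(774, 8, 3))  # three flags: effectively buried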
| 2OEH8eoCRo0 wrote:
| Which war crime was committed?
|
| https://www.un.org/en/genocideprevention/war-crimes.shtml
| Workaccount2 wrote:
| Probably most of section e.
|
| But Hamas fighters wear civilian clothes, so I'm not sure the
| rules even apply to them.
| ChrisArchitect wrote:
| Related from earlier:
|
| _Israel used AI to identify 37,000 Hamas targets_
|
| https://news.ycombinator.com/item?id=39917727
| wantlotsofcurry wrote:
| Upsetting how quickly the other thread was flagged and
| downranked.
| harimau777 wrote:
| Is there any consequence for inappropriate flagging?
| ykonstant wrote:
| Not in this instance, I assume. People flagging too much can
| result in shadowbanning, but perhaps the mods think that
| flagging posts that _might_ host heated political-religious
| discussion is ok (even if they don't _have_ such discussion,
| and even if they are on-topic for HN).
|
| I also don't think there is a way to complain about abusing
| flags other than emailing the mods; I have no clue about the
| effectiveness of this complaint.
| segasaturn wrote:
| I don't understand why it was flagged, obviously it is a
| sensitive topic but AI being used to kill people is very
| clearly a HN-worthy topic
| calibas wrote:
| It was flagged because someone doesn't want people seeing
| this.
|
| It's also currently dropping rank on the front page, despite
| being heavily upvoted.
| luketaylor wrote:
| Now removed from the front page even without being labeled
| as flagged.
| nemo44x wrote:
| Yeah, you'd hope that a higher level conversation about the
| use of technology in war, pros/cons, etc could supersede
| personal political beliefs about this particular conflict. We
| don't need people's moral judgements on who is right or wrong
| in this particular case but it would be neat to hear people's
| thoughts on utilizing information technology as a weapon of
| war.
| ilikehurdles wrote:
| One would hope, but I've read all 21 comments in this post
| and not a single one of them meets your criteria.
| dguest wrote:
| Let's see how long it takes this time! I'd give it 50% odds of
| lasting 12 minutes.
|
| Edit: Flagged after less than 9 minutes, I overestimated!
| theEntroX wrote:
| and then un-flagged right after?
| dguest wrote:
| It seems so. What a ride!
| mtlmtlmtlmtl wrote:
| I vouched for it.
| dfxm12 wrote:
| Where's this option?
| mtlmtlmtlmtl wrote:
| Usually the same place the flag button is. It only
| appears when a post or commen is flagged/dead.
| dfxm12 wrote:
| It wasn't there for me. I vaguely remember it being there
| before though.
| mtlmtlmtlmtl wrote:
| I think sometimes you have to click on the date to go
| directly to the post/comment, in order to see it.
| dfxm12 wrote:
| It was flagged in 9, but is now back. Get your comments in
| while you can!
| hn_throwaway_99 wrote:
| As someone who sees both sides of this, and as someone who
| didn't understand this for some time, it's important to
| understand that one reason a story is likely to get flagged is
| because users think _it 's highly unlikely to lead to
| productive discussion_. It doesn't mean it's a bad story, or
| even unworthy of discussion, but many types of stories seem to,
| pretty predictably, lead to a cesspool of comments where it's
| clear most folks have no desire to listen to opposing points of
| view.
|
| FWIW, I found this to be a really interesting story that I
| didn't previously know about, so I hope it stays up, and this
| is a story I'd be willing to vouch for.
| consumer451 wrote:
| There is a system in place for flagging specific comments by
| users.
|
| Admins can, and do, prune entire branches of comments off of
| posts.
|
| These two methods would take a bit more work than just
| banishing the topic entirely, but with topics like the first
| time that "AI" kill lists are publicized, maybe exceptions
| should be made.
| pphysch wrote:
| Successful flagging doesn't (just) disable comments, it
| disables discovery/access.
|
| For a high quality piece of tech-related investigative
| journalism like this, flagging is simply censorship.
| dfxm12 wrote:
| If one doesn't want to engage, the hide button isn't too far
| from the flag button. It's important that people have the
| option to speak freely and openly about this topic, since so
| many places shut down any conversation that shows sympathy
| for Palestinians and/or doesn't paint Israel as unequivocally
| morally good. This is one of the reasons Israel has been able
| to get away with this behavior for so long.
|
| Considering what regularly doesn't get flagged on this site
| related to AI, conflict, etc., this topic seems to fit in.
| throwaway74432 wrote:
| >it's highly unlikely to lead to productive discussion.
|
| I guess all you have to do, if you want to suppress
| information about something, is to ensure that its comments
| always devolve into unproductive discussions. Funny, I once
| read about this as a tactic for controlling information flow
| in online communities...
| axlee wrote:
| If only we had a word for this behaviour, for example some
| Nordic folklore creature?
| spxneo wrote:
| Flagging is voting to censor a particular view. It could have
| legit uses, like spam or toxic comments, but it's just as easy
| to use it to censor narratives that aren't aligned with, or
| clash with, the voter's.
|
| I'm not sure what other tools exist other than a block button
| like X's.
| GeoAtreides wrote:
| > users think it's highly unlikely to lead to productive
| discussion
|
| I wish people would let people decide for themselves what is
| productive or not...
| hn_throwaway_99 wrote:
| There's always Twitter/X or Reddit if that's your jam. I
| just think it's hard to disagree that a huge, if not
| primary, value people feel they get from HN is the
| discussion, which is probably unmatched compared to any
| open forum on the net, and a huge part of that is
| moderation and curation.
|
| Like I said, I _don't_ agree with this particular topic
| getting flagged (I saw it go back and forth numerous
| times), but I also would push back hard on any allegations
| of "censorship". There are plenty of completely open forums
| online anyone can access with a click, and HN is most
| decidedly _not_ that, by design, since the beginning of the
| site.
| thomastjeffery wrote:
| I don't take any issue with people flagging a post, so long as
| an actual person makes the ultimate decision on whether to keep
| it up.
|
| This is in contrast to how I feel about a statistical model
| flagging people to be murdered. That's not even remotely OK,
| even if the decision to actually carry out the murder
| ultimately goes through a person. Using a statistical model to
| choose targets is incredibly naive, and practically guarantees
| that perverse incentives will drive decision-making.
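|
| To make that concrete, here is a purely hypothetical sketch
| (made-up numbers, not the real system): once target volume
| becomes the goal, lowering a classifier's score threshold
| inflates the flagged list while the share of false positives
| balloons.
|
|     # Hypothetical illustration only; all numbers are invented.
|     import random
|
|     random.seed(0)
|
|     def score(is_target: bool) -> float:
|         # true targets tend to score higher, but the
|         # distributions overlap
|         return random.gauss(0.7 if is_target else 0.3, 0.15)
|
|     people = [random.random() < 0.02 for _ in range(100_000)]
|     scored = [(score(t), t) for t in people]
|
|     for threshold in (0.8, 0.6, 0.4):
|         flagged = [t for s, t in scored if s >= threshold]
|         false_pos = flagged.count(False)
|         print(threshold, len(flagged), false_pos)
|
| Dropping the threshold multiplies the size of the list, and
| most of the growth is false positives.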
| dang wrote:
| This is a typical phenomenon when a topic is divisive, and the
| Israel/Gaza topic is one of the most divisive.
|
| Edit: We sometimes turn off flags when an article contains
| significant new information and also has at least some chance
| of providing a substantive basis for discussion. I haven't read
| the current article yet but it seems like a reasonable
| candidate for this, so I turned off the flags.
|
| For anyone who wants more information about how we approach
| doing that, in the context of the current topic, here are some
| past explanations:
|
| https://news.ycombinator.com/item?id=39618973 (March 2024)
|
| https://news.ycombinator.com/item?id=39435324 (Feb 2024)
|
| https://news.ycombinator.com/item?id=39435024 (Feb 2024)
|
| https://news.ycombinator.com/item?id=39237176 (Feb 2024)
|
| https://news.ycombinator.com/item?id=38947003 (Jan 2024)
|
| https://news.ycombinator.com/item?id=38749162 (Dec 2023)
| edanm wrote:
| > This is a typical phenomenon when a topic is divisive, and
| the Israel/Gaza topic is one of the most divisive.
|
| Kind of related thought - is there a topic you think is
| _more_ divisive? And also, is there some way that this is
| measured officially or unofficially?
| dang wrote:
| No and in fact my comment originally said "the current
| topic is perhaps the most divisive HN has ever seen".
|
| Not measured, though, if you mean some kind of quantitative
| approach.
| edanm wrote:
| I wonder how that could be measured.
|
| I also think it's the most divisive topic here (for the
| last few months at least), but since it's obviously very
| personal for me, it's hard to know if that's a bias in my
| view.
| consumer451 wrote:
| > I wonder how that could be measured.
|
| Maybe posts with high Flag _and_ Vouch counts?
| curiousgal wrote:
| I don't think this topic is divisive anymore. I used to be
| on the fence about the whole conflict despite growing up in
| a Muslim country and being fed propaganda. But nowadays I
| can't in any shape or form rationalize Israel's actions.
| frob wrote:
| Seeing as these discussions are always insta-flagged and you
| need to revive them to allow for discussion, have you
| considered adding 'Israel' and 'Palestine' to a set of
| keywords for which flags need your approval to take effect,
| instead of letting automation take over?
|
| Having a human in the loop prevents bad-faith actors from
| abusing the system to suppress information and discussions.
| dang wrote:
| I think we probably already see the most important ones,
| such as the one today. If there's an article that
| particularly deserves having the flags turned off, people
| can always bring it to our attention at hn@ycombinator.com.
| arminiusreturns wrote:
| > "We were not interested in killing [Hamas] operatives only when
| they were in a military building or engaged in a military
| activity," A., an intelligence officer, told +972 and Local Call.
| "On the contrary, the IDF bombed them in homes without
| hesitation, as a first option. It's much easier to bomb a
| family's home. The system is built to look for them in these
| situations."
| tiahura wrote:
| Watching i24 news is a little unsettling. They run bits with
| interrogators announcing how productive torture has been, and
| make jokes about how it would be much easier if lemons just
| gave up their juice without being squeezed.
| A_D_E_P_T wrote:
| > _It's much easier to bomb a family's home._
|
| Okay, how is this not a war crime?
|
| There are ~2M civilians who live in Gaza, and many of them
| don't have access to food, water, medicine, or safe shelter.
| Some of those unfortunates live above, or below, Hamas
| operatives and their families.
|
| "Oh, sorry, lol." "It was unintentional, lmao, seriously." "Our
| doctrine states that we can kill X civilians for every hostile
| operative, so don't worry about it."
|
| The war in Gaza is unlike Ukraine -- where Ukrainian and
| Russian villagers can move away from the front, either towards
| Russia or westwards into Galicia -- and where nobody's
| flattening major population centers. In Gaza, anybody can
| evidently be killed at any time, for any reason or for no
| reason at all. The Israeli "strategy" makes the Ukrainians and
| Russians look like paragons of restraint and civility.
| pphysch wrote:
| The problem isn't determining whether Israel is committing
| warcrimes or genocide, it's the fact that it's a rogue,
| supremacist state that sees itself above international law,
| and is bolstered in that position by its unregistered-
| foreign-agent minions in Washington, a UNSC permanent member.
| mardifoufs wrote:
| Because it's Israel. It's also why no western country has
| ever really officially condemned Israel no matter what they
| do. They are on "our side" so it's okay. And those civilians
| kind of deserved it anyways or something, and we can just
| trust every single word the IDF says and use them as an
| actual source to pretend the IDF isn't into mass civilian
| murder.
|
| The only thing that made this time a bit different is the
| crazy, almost hard to believe, switch from the Ukrainian
| conflict and how it was seen and portrayed... To western
| countries staying completely silent when again, it's our side
| doing it. Well it wasn't hard to believe but it just made it
| a lot more blatant.
|
| Israel doesn't really care though, since Israeli officers
| routinely go on public tirades that amount to mask-off
| allusions to genocide ("wipe Gaza" "level the city to the
| ground" "make it unliveable"), with again 0 consequences at
| all. Even Russia at least tries to not have Russian military
| officers just say the quiet part out loud.
| cmrdporcupine wrote:
| Even when our own citizens are killed, they don't get
| condemned.
|
| E.g. the IDF targeted and killed a Canadian UN peacekeeper
| in 2008 (because he got too squeaky) and the Canadian gov't
| barely lodged a protest.
|
| https://www.cbc.ca/news/canada/ottawa/un-officer-reported-
| is...
| FdbkHb wrote:
| The attack on the USS Liberty is the most notable of
| those events.
|
| https://en.wikipedia.org/wiki/USS_Liberty_incident
|
| > The combined air and sea attack killed 34 crew members
| (naval officers, seamen, two marines, and one civilian
| NSA employee), wounded 171 crew members
|
| The only consequence for them was "paying compensations"
| as if there was a price to put on human lives.
| nickff wrote:
| The example you're citing was actually investigated, and
| (IIRC) it was found that Hezbollah was firing mortar(s)
| from a position directly adjacent to the UN post. I
| believe that it was generally assumed that Hezbollah was
| using the Canadians as 'human shields'. Culpability in
| such situations is usually attributed to the shield-
| users, largely due to the consequences of attributing
| blame to the retaliators (i.e. encouraging further use of
| human shields).
| engineer_22 wrote:
| > Okay, how is this not a war crime?
|
| Maybe it is. Maybe it isn't.
|
| Some questions worth asking: what is international law? How
| is international order maintained?
|
| I agree that images and footage from Gaza are disturbing. But
| I encourage you to think systematically about what it is we
| are seeing.
| SalmoShalazar wrote:
| I've thought about it systematically and it appears to be
| genocide
| Workaccount2 wrote:
| The war in Gaza is unlike Ukraine because Hamas does not
| issue uniforms or clearly demarcate military targets.
|
| When the US was in Afghanistan, Al Qaeda learned that the US
| (generally) won't shoot ambulances. So what became the most
| valuable vehicle to Al Qaeda? Hamas took notes, but Israel
| doesn't seem to care as much as the US.
|
| Also, besides all that, once something is used for military
| operations, it is fair game as a military target. Regardless
| of civilians. When the law was written it was assumed that
| governments wouldn't intentionally use their civilians as
| protection.
| sublimefire wrote:
| Isn't a military person a legitimate target at the time of the
| war? I think it is, the issue is the collateral damage. But
| then again this war shows that Hamas is also not following the
| rules and gets too close to civilians.
| supposemaybe wrote:
| Lavender: One person's flower, another person's AI death machine.
| majikaja wrote:
| https://www.aa.com.tr/en/middle-east/israeli-tanks-deliberat...
| mckirk wrote:
| > "You don't want to waste expensive bombs on unimportant people
| -- it's very expensive for the country and there's a shortage [of
| those bombs]"
|
| At that point I had to scroll back up to check whether this was
| just a really twisted April Fools' joke.
| xyzelement wrote:
| What part of this upsets you vs a baseline understanding of
| reality?
|
| There's often a criticism of the US military doctrine that our
| weapons are great but are often way more expensive than the
| thing we shoot them at (as exemplified in our engagement with
| the Houthis in the Red Sea.)
|
| If anything, the quote you pulled sounds like its talking about
| highly precise weaponry, and it seems to me that the way to
| minimize the overall death in a war is to use your precise
| weapons to take out the most impactful enemy.
|
| Which part of this is different than how you see the world so
| that reading this quote threw you?
| jakupovic wrote:
| I'll answer for the previous post. The most disturbing part
| is the claim that the main criterion is being male and that
| their models have a 10% error rate.
| xyzelement wrote:
| I don't think you're parsing the article correctly.
|
| There is no allegation that the main criterion for the
| algorithm is "being male."
|
| The allegation is that the human double-checking of the
| algorithm confirms the target is male (as opposed to
| woman/child.)
| jakupovic wrote:
| Not sure what the difference is given the end result?
| anigbrowl wrote:
| Civilians aren't strategic targets like military decision-
| makers, but describing them as 'unimportant' is a sign of
| moral vacuity.
| mckirk wrote:
| I know war isn't pretty, but I really didn't expect that
| openly displayed level of callousness. Saying 'we think these
| people should be dead, but they are not important enough to
| warrant our "good" bombs', to me, says a lot about the
| mentality of the people in charge of that military assault:
| those aren't human lives, those are items on a 'to kill'
| list, and they aren't surrounded by civilians, but
| 'acceptable collateral damage'.
| tokai wrote:
| It's rich when the argument for the system is that the targeting
| is the bottleneck.
| chasd00 wrote:
| >> "You don't want to waste expensive bombs on unimportant
| people -- it's very expensive for the country and there's a
| shortage [of those bombs]"
|
| Expensive relative to what? A single rifle bullet? JDAM kits
| are not expensive, they're easy to manufacture, and there are
| plenty of 500lb dumb bombs lying around. If a country has
| access to precision-guided bomb tech then I'd say they should
| be obligated to use it exclusively for bombing.
| throw7 wrote:
| Why is this flagged?
|
| Our premier AI geniuses were all squawking to Congress about the
| dangers of AI and here we see that "they essentially treated the
| outputs of the AI machine "as if it were a human decision."
|
| Sounds like you want to censor information that could hurt your
| bottom line.
| jessepasley wrote:
| It shows Israel in a bad light.
| 93po wrote:
| HN, both its community and the moderators, flag posts that
| generate a lot of conflict in the comments. The comments on
| this are especially bad by HN standards and therefore the
| flagging is in line with how the site is openly operated.
|
| I am pro Palestine and not simping for Israel. I think
| visibility on Israel's actions matter, but HN is also very
| clearly not the appropriate website for a lot of politically
| involved news.
| __loam wrote:
| I disagree with this, in this issue and more broadly.
| Technology and hacking are inextricably linked to politics,
| whether we like it or not. We cannot separate the effects
| technology has on society and the body politic, and politics
| has an effect on technology through regulatory regimes,
| policy, and the law. These discussions are important to the
| development of technology even if it makes people
| uncomfortable to see views they disagree with, though of
| course there are discussions that are unproductive and should
| not be allowed on this specific forum.
|
| Just as an example, the EU is setting a lot of law and policy
| surrounding technology right now, affecting how companies
| like Apple operate or putting policy into place to regulate
| emerging technologies like AI. The people who make the
| technology should be aware of those policies, how it affects
| what they build, and society's view on the products of their
| development more broadly.
|
| I realize Israel and Palestine is a charged topic, but in my
| view, the high stakes of that conflict and the threat to
| human life on both sides means it's more important to have
| conversations about technology in that context, not less.
| Those conversations are probably going to hurt somebody's
| feelings, but we ought to talk about issues like how freedom
| of speech online and terrorism are connected and how AI
| systems and the military are mixing, because it's important for
| maintaining the ethical fabric of our profession.
| dang wrote:
| I wrote about this here:
| https://news.ycombinator.com/item?id=39920732. If you take a
| look at that and the links there, and still have a question
| that isn't answered, I'd be happy to take a crack at it.
| fullstick wrote:
| The name of Lavender makes this so surreal to me for some reason.
| I'm of the opinion that algorithms shouldn't determine who lives
| and dies, but it's so common even outside of war.
| nemo44x wrote:
| I think the algorithm, in this case, makes a suggestion and
| then a human evaluates it. The article claims they've only
| looked at the sex of the target (kill if male) but also claims
| 90% effectiveness. I'm curious whether 90% is a good number.
| War will always have collateral damage, but if technology can
| help limit that beyond what only a human could do then I'd say
| it's a net positive. I think the massive efficiencies the
| algorithm brings to picking targets is a bit frightening
| (nowhere to run or hide now) but there's no real turning back.
|
| People thought this way about the machine gun, the armored
| tank, the atom bomb. But once the genie is out there's no
| putting it back in.
|
| As an aside, I think this is a good example of how humans and
| AI will work together to bring efficiency to whatever tasks
| need to be accomplished. There's a lot of fear of AI taking
| jobs, but I think it was Peter Thiel who said years ago that
| future AI would work side by side humans to accomplish tasks.
| Here we are.
| tokai wrote:
| >During the early stages of the war, the army gave sweeping
| approval for officers to adopt Lavender's kill lists, with no
| requirement to thoroughly check why the machine made those
| choices or to examine the raw intelligence data on which they
| were based. One source stated that human personnel often
| served only as a "rubber stamp" for the machine's decisions,
| adding that, normally, they would personally devote only
| about "20 seconds" to each target before authorizing a
| bombing
| instagib wrote:
| The code names for secret operations can be dead on or funny at
| times. I remember a few being emojis. It's only a matter of
| time until the USA's or other allied countries' secrets about
| using AI-enhanced information are released.
|
| How do you think they process millions of call records,
| intercepted messages, sim swaps, etc?
| FerretFred wrote:
| Next step is for similar AI systems to decide when to start a
| war, or not ...
| tmnvix wrote:
| Or to do away with the concept of starting and stopping wars
| altogether. Just constant AI based justifications for killing.
|
| Wouldn't be surprised if this has already been the case in
| Israel-Palestine: AI targeting of Palestinians long
| before October 7th, in other words.
| supposemaybe wrote:
| My question is:
|
| How far does the AI system go... is it behind the decision to
| starve the population of Gaza?
|
| And if it is behind the strategy of starvation as a tool of war,
| is it also behind the decision to kill the aid workers who are
| trying to feed the starving?
|
| How far does the AI system go?
|
| Also, can an AI commit a war crime? Is it any defence to say,
| "The computer did it!" Or "I was just following AI's orders!"
|
| There's so much about this death machine AI I would like to know.
| barbazoo wrote:
| > Also, can an AI commit a war crime? Is it any defence to say,
| "The computer did it!" Or "I was just following AI's orders!"
|
| It's not that the "AI" described here is an autonomous actor.
|
| > During the early stages of the war, the army gave sweeping
| approval for officers to adopt Lavender's kill lists, with no
| requirement to thoroughly check why the machine made those
| choices or to examine the raw intelligence data on which they
| were based. One source stated that human personnel often served
| only as a "rubber stamp" for the machine's decisions, adding
| that, normally, they would personally devote only about "20
| seconds" to each target before authorizing a bombing
|
| Obviously all this is to be taken with a grain of salt, who
| knows if it's even true.
| diggan wrote:
| > How far does the AI system go... is it behind the AI decision
| to starve the population of Gaza?
|
| No, the point of this program seems to be to find targets for
| assassination, removing the human bottleneck. I don't think
| bigger strategic decisions like starving the population of Gaza
| were bottlenecked in the same way that finding/deciding on
| bombing targets is.
|
| > is it also behind the decision to kill the aid workers who
| are trying to feed the starving?
|
| It would seem like this program gives whoever is responsible
| for the actual bombing a list of targets to choose from, so
| supposedly a human was behind that decision but aided by a
| computer. Then it turns out (according to the article at least)
| that the responsible parties mostly rubberstamped those lists
| without further verification.
|
| > can an AI commit a war crime?
|
| No, war crimes are about making individuals responsible for
| their choices, not about making programs responsible for their
| output. At least currently.
|
| The users/makers of the AI surely could be held in violation of
| laws of war though, depending on what they are doing/did.
| dfxm12 wrote:
| _No, the point of this program seems to be to find targets
| for assassination, removing the human bottleneck._
|
| There is also another AI system that tracks when these target
| get home.
|
| _Additional automated systems, including one called "Where's
| Daddy?" also revealed here for the first time, were used
| specifically to track the targeted individuals and carry out
| bombings when they had entered their family's residences._
|
| I think "assassination" colloquially means to pinpoint and
| kill one individual target. I don't mean to say you are
| implying this, but I do want to make it clear to other
| readers that according to the article, they are going for max
| collateral damage, in terms of human life and infrastructure.
|
| _"The only question was, is it possible to attack the
| building in terms of collateral damage? Because we usually
| carried out the attacks with dumb bombs, and that meant
| literally destroying the whole house on top of its occupants.
| But even if an attack is averted, you don't care -- you
| immediately move on to the next target. Because of the
| system, the targets never end. You have another 36,000
| waiting."_
| diggan wrote:
| Yeah, I wasn't 100% sure of using the "assassination"
| wording in my comment, but after thinking about it I felt
| the most neutral approach was to use the same wording they
| use in the article itself, in order to not add my own
| subjective opinion about this whole saga.
|
| > In an unprecedented move, according to two of the
| sources, the army also decided during the first weeks of
| the war that, for every junior Hamas operative that
| Lavender marked, it was permissible to kill up to 15 or 20
| civilians; in the past, the military did not authorize any
| "collateral damage" during assassinations of low-ranking
| militants. The sources added that, in the event that the
| target was a senior Hamas official with the rank of
| battalion or brigade commander, the army on several
| occasions authorized the killing of more than 100 civilians
| in the assassination of a single commander.
|
| I'd agree with you that once you decide it's worth killing
| 100 civilians for one target, it's really hard to call it
| "assassination" at that point...
| sitkack wrote:
| The system is designed to kill the target's family. This
| is a war crime.
| thomastjeffery wrote:
| > Also, can an AI commit a war crime?
|
| "An AI" doesn't exist. What is being labeled "AI" here is a
| statistical model. A model can't _do_ anything; it can only be
| used to sift data.
|
| No matter where in the chain of actions you put a model, you
| can't offset human responsibility to that model. If you try,
| reasonable people will (hopefully) call you out on your
| bullshit.
|
| > There's so much about this death machine AI I would like to
| know.
|
| The death machine here is Israel's military. That's a group of
| people who don't get to hide behind the facade of "an AI told
| me". It's a group of people who need to be held responsible for
| naively using a statistical model to choose who they murder
| next.
| anjel wrote:
| A rather opinionated site with no about page.
| luketaylor wrote:
| https://mediabiasfactcheck.com/972-magazine/
| dfxm12 wrote:
| https://www.972mag.com/about/
| avtar wrote:
| Another site covering this story:
|
| https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai...
| throwaway240403 wrote:
| Bad UX, it's hiding under the hamburger menu in the top left
| https://web.archive.org/web/20240401100849/https://www.972ma...
| kazmer_ak wrote:
| Turns out, it, too, was just 1000 dudes in India watching camera
| footage and clicking things.
| barbazoo wrote:
| Getting all these reports about atrocities, I wonder if the
| conflict in the area has grown more brutal over the decades or if
| this is just business as usual. I'm in my late 30s; growing up in
| the EU, the conflict in the region was always present. I don't
| remember hearing the kind of stories that come to light these
| days though, indiscriminate killings, food and water being
| targeted, aid workers being killed. I get that it's hard to know
| what's real and what's not and that we live in the age of
| information, but I'm curious how, on a high level, the conflict
| is developing. Does anyone have a good source that deals with
| that?
| dfxm12 wrote:
| Most of the mainstream media has historically glossed over the
| atrocities, but it is impossible to ignore them today because
| of what we see live on the scene thanks to smaller outlets
| having a broader reach and social media.
|
| It's mostly business as usual. The technology makes the
| brutality more efficient, though:
|
| _Describing human personnel as a "bottleneck" that limits the
| army's capacity during a military operation, the commander
| laments: "We [humans] cannot process so much information. It
| doesn't matter how many people you have tasked to produce
| targets during the war -- you still cannot produce enough
| targets per day."_
|
| ...
|
| _By adding a name from the Lavender-generated lists to the
| Where's Daddy? home tracking system, A. explained, the marked
| person would be placed under ongoing surveillance, and could be
| attacked as soon as they set foot in their home, collapsing the
| house on everyone inside.
|
| "Let's say you calculate [that there is one] Hamas [operative]
| plus 10 [civilians in the house]," A. said. "Usually, these 10
| will be women and children. So absurdly, it turns out that most
| of the people you killed were women and children."_
|
| Using Google search, you can search news articles from previous
| years. You'll find older articles about Israel killing aid
| workers, for example. This is from 2018:
| https://www.theguardian.com/global-development/2018/aug/24/i...
|
| The interesting thing about how this conflict is developing is
| that this story is full of quotes from Israeli intelligence.
| Most plainly say what they're doing. Western outlets may put a
| positive spin on it (because our governments generally support
| Israel), but the Israeli military themselves are making their
| intentions clear: https://news.yahoo.com/israeli-minister-
| admits-military-carr...
| xk_id wrote:
| The weaponisation of online media for manipulating the
| perception of global audiences about the conflict has
| definitely ramped up recently. For example, the official
| Twitter account of Israel's Ministry of Foreign Affairs has
| posted videos of Muslim preachers appearing to denounce LGBT
| culture during public services in Palestinian mosques. Hamas
| themselves are denying their involvement in the 2023 massacre
| and accusing Israel of staging the graphic footage that was
| disseminated. This greatly polarises the debates on social
| media and it's much more common now to see people who are
| deeply invested emotionally in the narrative of either side.
| kjkjadksj wrote:
| When the US dropped napalm indiscriminately over the Vietnamese
| jungle, absolutely leveled Dresden, or unleashed nuclear
| hellfire over Japan, they probably killed a lot of journalists
| and doctors and food workers as well. Interestingly, Western
| media did not beat itself into a frenzy over it at the time.
| It's easy to get cynical about it all
| seeing how easily narratives are manufactured and controlled to
| serve political ends.
| segasaturn wrote:
| > Interestingly, western media did not beat itself into a
| frenzy over it at the time
|
| Western mainstream media has been very passive when covering
| the current situation in Gaza, especially when you contrast
| it with how they covered the war in Ukraine just two years ago.
| It's just that social media has allowed people to break
| through the canned media narratives.
| barbazoo wrote:
| > Western mainstream media has been very passive when
| covering the current situation in Gaza
|
| Just FYI, all the examples I mentioned I read on our public
| broadcaster's website.
| anigbrowl wrote:
| Media coverage of the Vietnam War was one of the decisive
| factors in the eventual US withdrawal, and was a key part of
| the NVA's strategy.
|
| WW2 was a considerably different war in scope, origin, and
| patterns of escalation.
| archagon wrote:
| I believe the bombing of Dresden was controversial and
| elicited pushback in the media, though it's not surprising
| that reactions may have been muted given the apocalyptic
| nature of the war.
|
| The use of napalm in Vietnam triggered widespread protests.
| realusername wrote:
| > Interestingly, western media did not beat itself into a
| frenzy over it at the time
|
| They did, and the newspaper coverage is the main reason why
| the Vietnam War ended.
| tokai wrote:
| >While humans select these features at first, the commander
| continues, over time the machine will come to identify features
| on its own. This, he says, can enable militaries to create "tens
| of thousands of targets,"
|
| So overfitting or hallucinations as a feature. Scary.
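|
| For anyone unfamiliar with why that is alarming: when a model's
| own outputs are later treated as fresh training labels
| (self-training / pseudo-labelling), early mistakes tend to be
| reinforced rather than corrected. A deliberately crude toy of
| that feedback loop, with entirely made-up data and no claim
| about how Lavender actually works:
|
|     import random
|
|     # toy "population": one random suspicion score per person
|     people = [random.random() for _ in range(10_000)]
|
|     threshold = 0.99  # initial, hand-tuned decision rule
|     for gen in range(5):
|         flagged = [s for s in people if s > threshold]
|         # naive "retraining": treat everything already flagged
|         # as ground truth and relax the rule toward that group
|         threshold = 0.5 * threshold + 0.5 * min(flagged) * 0.95
|         print(gen, round(threshold, 3), len(flagged))
|
| Each pass widens the net a little, and nothing in the loop ever
| checks the flags against reality.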
| NickC25 wrote:
| This shouldn't be flagged.
| bitcharmer wrote:
| Given how prevalent gang-flagging as a form of censorship has
| become here on HN, I think they should consider removing
| flagging functionality for submissions entirely. It should of
| course stay for comments but posts typically get flagged for
| political reasons and nothing else.
| goethes_kind wrote:
| Israel's evil keeps taking me by surprise. I guess when people go
| down the path of dehumanization there are truly no limits to what
| they are ready to do.
|
| But what is even sadder is that the supposedly morally superior
| western world is entirely bribed and blackmailed to stand behind
| Israel. And then you have countries like Germany where you get
| thrown in jail for being upset at Israel.
| HDThoreaun wrote:
| It's been pretty clear to me for a while now that Israel's long
| term plan for the Palestinians is to expel them all. Starvation
| isn't a requirement for that, but it is probably the path of
| least resistance. I will say that it's happening a lot faster
| than I expected though; Israel is definitely taking advantage
| of the situation here.
| KingMob wrote:
| But in lieu of expulsion, it seems they're ok with starvation
| and mass murder as alternatives.
| jhallenworld wrote:
| I'm in the camp that thinks the two-state solution is
| dead... which means we are left with an eventual one-state
| solution. Which means they are killing their own future
| voters.
| NickC25 wrote:
| They aren't trying to expel them all.
|
| They want to ethnically cleanse the region of them.
| edanm wrote:
| That... means the same thing.
| lupusreal wrote:
| Then let's not mince words. Ethnic cleansing shouldn't be
| softballed.
| NickC25 wrote:
| It sort of does. I think, however, Israel would love it
| if all the Palestinians just buggered off to Egypt or
| Lebanon or some other MENA country.
|
| Sadly, since they haven't, Israel has decided that
| killing them is the only logical course of action.
| sam_goody wrote:
| Actually they are very happy to live with their Arab
| neighbors, and even invite them into their homes and lives.
|
| You can find plenty of pro Palestinian speeches and
| sentiments from those who chose to live in the South, who
| were then murdered by the people they supported.
|
| Many (most?) of those who committed the attacks on Oct. 7
| were working in Israeli houses and factories, and they
| proceeded to kill their employers and co-workers.
|
| Unfortunately, the news is selectively reported, and
| nonsense from Hamas is reported as truth, and there is a
| HEAVY slant against Israel. But no need to parrot stuff
| like this which is openly against the facts.
| scotty79 wrote:
| > It's been pretty clear to me for a while now that Israel's
| long term plan for the Palestinians is to expel them all.
|
| Or kill them. Unsurprisingly expel or kill is exactly the
| plan Palestinians have for Jews.
| gryzzly wrote:
| what do you mean "bribed and blackmailed"?
| luketaylor wrote:
| On AIPAC in the US:
|
| 1. "How the Israel lobby moved to quash rising dissent in
| Congress against Israel's apartheid regime"
|
| 2. "Top Pro-Israel Group Offered Ocasio-Cortez $100,000
| Campaign Cash"
|
| 3. "Senate Candidate in Michigan Says He Was Offered $20
| Million to Challenge Tlaib"
|
| [1]: https://theintercept.com/2023/11/27/israel-democrats-
| aipac-b...
|
| [2]: https://www.huffpost.com/entry/ocasio-cortez-aipac-
| offer-con...
|
| [3]: https://www.nytimes.com/2023/11/22/us/politics/hill-
| harper-r...
| gryzzly wrote:
| hm, and how do you feel about Qatar sponsoring higher
| education in the US? https://en.wikipedia.org/wiki/Qatari_i
| nvolvement_in_higher_e...
|
| Not sure these three links show that "supposedly morally
| superior western world is entirely bribed and blackmailed".
| Especially on the "entirely" and "blackmail" parts.
| hikingsimulator wrote:
| > hm, and how do you feel about Qatar sponsoring higher
| education in the US?
|
| Focusing on international interference by one state does
| not reduce the blame that can be thrown at another.
| There's no limited reserve of blame that needs to be
| cleverly distributed. The undemocratic influence over
| public institutions by lobbies, whether Qatar's (see
| Qatargate in Europe), Israeli-linked ones, or many
| others, is the death of our societies.
| Adverblessly wrote:
| Surely if Israel is bribing in one direction and Qatar is
| bribing in the other direction, someone is not getting
| their money's worth? That is, the final result is either
| that the "western world is entirely bribed and
| blackmailed to stand behind Israel" or that they don't
| stand behind Israel.
| hikingsimulator wrote:
| That's a dumb argument. It's not like Qatari money is
| trying to buy the mathematical inverse of Israeli money
| in a game of tit-for-tat.
| pphysch wrote:
| Foreign influence from Qatar is another serious case, but
| still small fries compared to malign foreign influence
| from Israel.
| ben_w wrote:
| > And then you have countries like Germany where you get thrown
| in jail for being upset at Israel.
|
| Back in 2002 or so, a friend of mine swore blind that an
| American had been arrested for wearing a "give whirled peas a
| chance" T-shirt -- which is an anecdotal way of saying: are you
| sure you've got the full story?
|
| I'm learning German by listening to „Langsam Gesprochene
| Nachrichten" by Deutsche Welle, and it definitely looks like a
| lot of people are less than enthusiastic about how Israel's
| forces are conducting themselves in war _despite_ the constant
| note that Hamas is (1) a terror organisation that (2) started
| this particular round by killing 1000 civilians:
| https://www.dw.com/en/israel-withdraws-from-gazas-devastated...
|
| Germany is also _extremely_ sensitive to every aspect of this
| due to the events of 80 years ago.
|
| Reports I've seen from the BBC show that there are significant
| protests _in Israel_, by those who consider the war to be
| justified, against their own government, not only for dropping
| the ball by failing to prevent the initial attack, but also for
| driving a wedge between them and their closest allies with the
| conduct of the war: https://www.bbc.com/news/world-middle-
| east-68722308
| segasaturn wrote:
| https://en.wikipedia.org/wiki/Israel%E2%80%93Hamas_war_prote.
| ..
|
| >In Berlin, authorities banned a pro-Palestinian rally from
| being held.[176] A number of spontaneous demonstrations
| protesting the bombing of Gaza took place across the country,
| but were forcefully broken up by police.[177] Germany banned
| fundraising, the displaying of the Palestinian flag and the
| wearing of the keffiyeh.[13]
|
| >In Neukölln, a neighborhood of Berlin, pro-Palestinian
| protesters described police crackdowns on protest that were
| "shocking and violent".[180]
| ben_w wrote:
| [176]: "On Wednesday, Germany's capital Berlin banned a
| pro-Palestinian rally due to several previous demos
| spreading antisemitic hatred."
|
| [177]: "Police broke up the protest by force stating that,
| according to a police spokesperson, public safety was
| threatened by "anti-Israel and violence-glorifying chants"
| and the wearing of masks."
|
| [13]: ""Hamas is already labelled as a terrorist
| organisation in Germany, but now Berlin will prohibit any
| activities in support of the group or its agenda," Scholz
| said in a speech to parliament. The ban will apply to
| fundraising, the display of the Palestinian flag, and even
| the wearing of the Palestinian keffiyeh."
|
| [180]: that one does sound bad even in the source material,
| I'm not going to attempt to delve deeper into that and
| instead will take it at face value.
| lupusreal wrote:
| > _But what is even sadder is that the supposedly morally
| superior western world is entirely bribed and blackmailed to
| stand behind Israel._
|
| Add religious indoctrination to that. A huge number of
| Americans are evangelical Christians who _unconditionally_
| support Israel because they are utterly convinced that the
| continued existence of Israel is a necessary prerequisite for
| the reincarnation of their god.
| jcranmer wrote:
| There is something like a generational divide going on here.
| Much of the older generation remembers the wider Israeli-Arab
| conflict (ongoing since 1948, and arguably even decades before
| that) as "Israel's neighbors repeatedly invade it to try to
| wipe it off the map." But the last such war was 1973; even the
| Second Intifada ended in 2005. For the younger generation, the
| conflict is largely "Israel repeatedly invades its neighbors to
| tamp down on terrorism." In other words, Israel has largely
| shifted from being the aggressee to the aggressor in the
| conflict, and sympathy naturally tends to lie with the
| aggressee.
|
| That said, there's also something noticeably different about
| this conflict. For the first time, the reporting I've seen in
| the mainstream press has generally been trending negative
| towards Israel. For example, the Washington Post has had a
| recent article on a press tour the IDF led of the burned-out
| remains of the hospital it attacked, clearly part of a campaign
| to justify why it was necessary, and the entire article was
| dripping with subtext of "we don't buy what the IDF is saying".
| And even the political headlines are generally framed in a way
| to keep you asking "should the US even be supporting Israel?"
|
| Israel has already squandered all the sympathy it got from the
| terrorist attacks last October, and it's well on the way to
| squandering all residual sympathy from the Holocaust. And the
| Israeli political and military establishment seems to have zero
| clue that this is going on.
| nickpsecurity wrote:
| That's not true. Within a short time of forming, all the
| surrounding nations attacked Israel to ensure they wouldn't
| exist there. Israel's opponents regularly targeted civilians
| with indiscriminate bombings since that's what their morals
| produce. They planned to keep doing that over time, too. Keep
| that in mind when interpreting everything else.
|
| At times, Israel allowed for a two-state solution but Hamas
| wanted every Jew there dead or gone. They'd push them into the
| ocean itself if allowed. People called for Israel reducing
| their presence in Gaza for peace. Doing that led to more
| attacks instead of more peace.
|
| Recently, Hamas killed and kidnapped civilians on purpose.
| Whereas, Israel warned people to leave before the invasion
| where they then focused on military targets. If people stayed
| and were connected to those, they'll likely die during the
| invasion. The OP is about people who stayed that are mostly
| connected to militants. OP writer pities their families but not
| all the non-militant families Hamas killed.
|
| While both sides are plenty guilty, one is actually aiming for
| peace, focusing on military targets, and reducing civilian
| casualties. The other broke peace, attacked civilians, and
| called for more genocide. The difference between these two
| strategies shows that anyone wanting long-term stability with
| less murder in the area should support Israel.
|
| Also, Israel is allied more with us while their opponents keep
| funding terrorist groups, including our own enemies. They're
| also strong economic partners. Why on earth would we ditch our
| friends to back people who do little for us and support our
| enemies?
| realo wrote:
| How is this not a genocide?
|
| How are those "acceptable" collateral deaths not war crimes?
| Stevvo wrote:
| It is and they are.
| stale2002 wrote:
| To actually answer your question, it is because the word
| "genocide" has a very specific meaning that is different from
| "They did something bad".
|
| You can think that what they are doing is bad, but that's
| unrelated to the highly specific claim of genocide, which
| requires specific intent.
| algem wrote:
| this is a horrific use of ai
| jarenmf wrote:
| Damn, some people really don't want anyone to see this
| jauntywundrkind wrote:
| So frustrating how easy it is for those of a certain zeal to
| wipe off mention of that which they find inconvenient.
|
| There could hardly be a more pertinent issue for tech right
| now. Just sweepingly wild shit that we should be grappling
| with.
| ein0p wrote:
| They unironically named one of the systems used to kill people
| there "Where's daddy?" These are the psychopaths we send billions
| of dollars in military aid to? Wtf?
| random9749832 wrote:
| Don't worry, a lot of us are taking notes.
| yboris wrote:
| PSA: https://www.stopkillerrobots.org/
| binarymax wrote:
| I really want to support this, but the website is pretty bad.
| Blinding colors, poor and sparse information, and links to
| shop/donate without a notion as to what or who the org is.
| mathandstuff wrote:
| Actually, The Campaign to Stop Killer Robots fired its campaign
| manager Ousman Noor as a result of him advocating against the
| IDF's killings in Gaza. The Campaign initially denied that it
| was over his Gaza advocacy, but eventually admitted that it was
| because of him speaking to diplomats which he met through the
| Campaign. Many members of the campaign support the IDF's
| arguable genocide, despite how surprising that might be.
| dhanna wrote:
| The use of these AI systems is the biggest evidence of the
| genocidal rules of engagement from the Israelis.
| aaomidi wrote:
| I wonder if the WCK assassinations were related to this.
| rich_sasha wrote:
| I don't like anything about this war, but in a way, I think
| concerns of AI in warfare are, at this stage, overblown. I'm more
| concerned about the humans doing the shooting.
|
| Let's face it, in any war, civilians are really screwed. It's
| true here, it was true in Afghanistan or Vietnam or WWII. They
| get shot at, they get bombed, by accident or not, they get
| displaced. Milosevic in Serbia didn't need an AI to commit
| genocide.
|
| The real issue to me is what the belligerents are OK with. If
| they are ok killing people on flimsy intelligence, I don't see
| much difference between perfunctory human analysis and a crappy
| AI. Are we saying that somehow Hamas gets some brownie points for
| _not_ using an AI?
| tech_ken wrote:
| I like this point, and I do think you're rightly pointing out
| that the issue is that selection of targets may be done badly,
| not that AI specifically is in the loop. With that said, I
| think an important detail you're overlooking is the
| frictionless-ness of this process. That quote people are
| throwing around about something like "efficiently producing the
| largest volume of human targets" gets to this point pretty
| directly I think. The problem is not just that the evidence
| might be flimsy, it's also that it's extremely easy to generate
| massive lists of targets.
|
| Instead of the Milosevic example I'd say it's analogous to
| Dehomag machines during the Holocaust. The Nazis didn't _need_
| advanced database systems to attempt a genocide, but having
| access to them made it far far easier to turn the whole process
| into a factory line: something predictable and constant that
| allowed it to achieve a pace and scope far beyond what they
| would have been able to do otherwise. Similar here, or in other
| cases where advanced technology is brought to bear in war.
| Anything that makes human death more automated is, IMO,
| abhorrent and worthy of criticism in its own right.
| rich_sasha wrote:
| I agree making something bad easier is bad too. But does AI
| make the bad thing easier here?
|
| I see two cases here. One is that the AI has some non-
| negligible accuracy, and one where it doesn't. If it's
| somewhat accurate, then actually, using it is saving civilian
| lives, attacking only the active enemy.
|
| And if it's inaccurate... Then presumably whoever made it
| knows it, and whoever uses it knows it's merely a fig leaf
| for shooting random people, and is ok with that. Is it then
| worse to kill random people as found by an AI than to drop a
| bomb somewhere, because you have a hunch there might be a
| worthwhile target there? This is the bit I'm not sure of.
|
| In this war, it's so easy to find the other side. If you want
| to recklessly shoot civilians, they are just on the other
| side of the wall. I'm not sure that AI makes it any easier.
| majikaja wrote:
| Will America fight at Israel's bidding if it starts a war with
| Iran, thus opening a new front alongside the war against Russia?
| gregw134 wrote:
| Risk any American soldiers? Definitely not. Support with drone
| strikes, sanctions, intelligence? Already doing that.
| contemporary343 wrote:
| I'm really not sure why this got flagged. It seemed like a well
| sourced and technology-focused article. Independent of this
| particular conflict, such automated decision making has long been
| viewed as inevitable. If even a small fraction of what is being
| reported is accurate it is extraordinarily disturbing.
| dang wrote:
| I wrote about this here:
| https://news.ycombinator.com/item?id=39920732. If you take a
| look at that and the links there, and still have a question
| that isn't answered, I'd be happy to take a crack at it.
| nahuel0x wrote:
| Using the latest advances in technology and computing to plan and
| execute an ethnic cleansing and genocide? Sound familiar? If
| not, check "IBM and the Holocaust".
| Stevvo wrote:
| First time I've really felt like I'm living in a dystopian
| science fiction.
| giantg2 wrote:
| "Lavender learns to identify characteristics of known Hamas and
| PIJ operatives, whose information was fed to the machine as
| training data, and then to locate these same characteristics --
| also called "features" -- among the general population, the
| sources explained. An individual found to have several different
| incriminating features will reach a high rating, and thus
| automatically becomes a potential target for assassination."
|
| Hamas combatants like fried chicken, beer, and women. I also like
| these things. I can't possibly see anything wrong with this
| system...
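|
| There is a real statistical point under the sarcasm: when the
| class being hunted is a tiny fraction of the population, even a
| score built from individually "accurate" features is dominated
| by false positives. A minimal Bayes-style sketch, with entirely
| made-up numbers (none of these figures come from the article):
|
|     prevalence  = 0.01   # 1% of scored people are operatives
|     sensitivity = 0.90   # share of operatives the score flags
|     specificity = 0.95   # share of others correctly ignored
|
|     population = 1_000_000
|     true_pos  = prevalence * population * sensitivity
|     false_pos = (1 - prevalence) * population * (1 - specificity)
|
|     precision = true_pos / (true_pos + false_pos)
|     print(f"precision: {precision:.0%}")   # ~15%
|
| Under these toy numbers roughly five out of six people flagged
| are not in the target class at all, which is why features shared
| with the general population are exactly the wrong thing to score
| on.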
| amarcheschi wrote:
| This literally looks like any abhorrent AI "predicting" system
| such as the ones we've heard a ton about in the past, with the
| same mistakes (I wonder if they're really mistakes, bugs, or
| ahem... Features)
| skilled wrote:
| I am more curious about the "compute" of an AI system like this.
| It must be extremely complicated to do real-time video feed
| auditing and classification of targets, etc.
|
| How is this even possible to do without having the system make a
| lot of mistakes? As much AI talk as there is on HN these days, I
| would have recalled an article that talks about this kind of
| military-grade capability.
|
| Are there any resources I can look at, and maybe someone here can
| talk about it from experience.
| Mountain_Skies wrote:
| Maybe it's like Amazon's cashierless stores that turned out to
| be mostly powered by 1000 humans working behind the scenes.
| resource_waste wrote:
| I'm probably pro-Israel because I'm a realpolitik American who
| wants America's best interest. (But I don't feel strongly
| either way.)
|
| Just watched someone get their post deleted for criticizing
| Israel's online PR/astroturfing.
|
| Israel's ability to shape online discussion has left a bad taste
| in my mouth. Trust is insanely low, I think the US should get a
| real military base in Israel in exchange for our effort. If the
| US gets nothing for their support, I'd be disgusted.
| wara23arish wrote:
| I'm curious, if you're a realpolitik American:
|
| Can you explain why the USA would support one country instead
| of appeasing 300 million in the area?
|
| What are the benefits of being so pro-Israel?
| emchammer wrote:
| It is the promised land of the Bible (Torah), where there
| used to be The Temple to THE God. As for all the details
| arising from that, that's the realpolitik.
| xenospn wrote:
| They do both.
| dang wrote:
| Posts don't get deleted on HN, except on rare occasions when
| the author asks us to delete something (and usually then only
| if they didn't get replies).
|
| Posts do get flagged and/or killed, whether by user flags,
| software, or mods, but you can always see all of those if you
| turn 'showdead' on in your profile. This is in the FAQ:
| https://news.ycombinator.com/newsfaq.html.
|
| If you notice a post getting flagged and/or killed that
| shouldn't have been, you can let us know and we'll take a look.
| You can also use the 'vouch' feature, also described in
| https://news.ycombinator.com/newsfaq.html.
| spxneo wrote:
| The most disturbing part for me (going beyond Israel/Palestine
| conflict) is that modern war is scary:
|
| - Weaponized financial trojan horses like crypto
|
| - Weaponized chemical warfare through addictions
|
| - Drone swarm attacks in Ukraine
|
| - AI-engineered social-media outrage to change the public's perception
|
| - Impartial, jingoistic mainstream war propaganda
|
| - Censorship and manipulation of neutral views as immoral
|
| - Weaponized AI software
|
| Looks like a major escalation towards a total war of sorts.
| surfingdino wrote:
| War has always been scary. We are busy inventing new ways of
| killing each other and there is no sign of stopping.
| bawolff wrote:
| I'm sorry, you think this is new?
|
| War is terrible. War has always been terrible. It was almost
| certainly worse in the past, but it still sucks now. Most of
| the things you mention were way worse 100 years ago.
|
| Sure, AI didn't write the propaganda; humans did. The
| effect was the same.
| skilled wrote:
| The world has been in a state of perpetual war forever! That is
| actually quite interesting in and of itself.
|
| There has been no mass self-correction to my knowledge that
| would avert this kind of destructive behavior.
|
| But in saying that, I am fully aware that most such behavior
| stems from people who are in charge of the world at a political
| level.
|
| Is it implausible to think that this is something that will
| have to change in order for the world to change?
|
| The war doesn't serve anyone but a few rotten minds who are
| trying to make decisions on behalf of millions if not billions
| of people.
|
| And we share a similar hunch. I do think that what is happening
| in the world today is a mere preparation (of society) for a
| massive power struggle in various parts of the world that will
| inevitably lead to a full-blown war. But this is only my
| personal feeling/interpretation.
| cgh wrote:
| Judged by number of war-related deaths per capita, we are
| living in the most peaceful time in human history. The last
| major conflict was the Second Congo War in the '90s, which
| killed around 5.4 million people and involved a bunch of
| African nations. If you want to talk about scary wars, try
| reading about that one.
|
| I realize this seems almost unrealistically upbeat, and most
| people don't want to believe it given what we see in the media
| every day. Note that I'm not arguing against increasing global
| instability, which will become worse if Russia triumphs in
| Ukraine (whatever form that could take) or the US continues to
| turn its back on its allies.
|
| Disinformation and AI fakery via social media are probably the
| scariest things to me on your list. Twitter is now a garbage
| dump for this stuff, but the good news is that it is
| hemorrhaging both users and money.
| binary132 wrote:
| I don't see the magnitude of mortality as necessarily a good
| indicator for the prevalence of violence or "peace".
|
| Let's say, for the sake of the thought experiment, that every
| weekday, a small swarm of killer drones is released in your
| city. These drones reliably, randomly target and kill 250
| commuters per weekday.
|
| That's only 62,500 people per year. Pretty mild. Certainly
| nowhere near as bad as Covid, maybe about as bad as a bad flu
| year, right? Heart disease kills about 700,000 people a year,
| so it's not even 10% of that. Barely registers on the
| dashboard.
| mzs wrote:
| _... normally, they would personally devote only about "20
| seconds" to each target before authorizing a bombing -- just to
| make sure the Lavender-marked target is male. ..._
| mrs6969 wrote:
| No human being would accept this. If it can happen to the
| Palestinian people, it can happen to any other country in the
| world. Israel is committing genocide in front of the world. 50
| years from now, some people will be sorry while committing
| another genocide.
|
| Be ready to be targeted by AI, from another state, in another
| war.
| mirekrusin wrote:
| The red flag for me is the part where they say it was left to
| a human to decide whether the AI generated a correct target or
| a false positive, based on voice recognition performed by a
| human: (...)
| at some point we relied on the automatic system, and we only
| checked that [the target] was a man -- that was enough. It
| doesn't take a long time to tell if someone has a male or a
| female voice (...)
|
| ...sounds fake as shit. Any dumb system can make a male/female
| decision automatically; no fucking way a human needs to verify
| it by listening to recordings while a sophisticated AI system
| is involved in filtering.
|
| Why would half a dozen active military officers brag about
| careless use of tech and bombing families with children while
| they sleep, risking accusations of treason?
|
| Feels like well done propaganda more than anything else to me.
|
| It's plausible they use AI. It's also plausible they don't that
| much.
|
| It's plausible it has a high false positive rate. It's also
| plausible it has multiple layers of cross-checks and very high
| accuracy - better than human personnel.
|
| It's plausible it is used in a rush without any double-checks at
| all. It's also plausible it's used with or after other
| intelligence. It's plausible it's used as final verification
| only.
|
| It's plausible that targets are easier to locate at home. It's
| also plausible they're not, i.e. it may be easier to locate them
| around listed, known operations buildings or tracked vehicles,
| or while a known, tracked mobile phone is in use, etc.
|
| It's plausible that half a dozen active officers want to share
| this information. It's also plausible that only a narrow group
| of people has access to this information. It's plausible they
| would not engage in activity that could be classified as
| treason. It's also plausible most personnel simply don't know
| the origin of orders up the chain, just the immediate link.
|
| It's plausible it's real information. It's also plausible it's
| fake or even AI generated, good quality, possibly intelligence
| produced fake.
|
| Frankly, looking at AI advances, I'd be surprised if propaganda
| quality lagged behind operational, on-the-ground use.
| sequoia wrote:
| I'm disturbed by the idea that an AI could be used to make
| decisions that could proactively kill someone. (Presumably
| computer already make decisions that passively kill people by,
| for example, navigating a self-driving car.) Though there was a
| human sign-off in this case, it seems one step away from people
| being killed by robots with zero human intervention which is
| about one step away from the plot of Terminator.
|
| I wonder what the alternative is in a case like this. I know very
| little about military strategy-- without the AI would Israel have
| been picking targets less, or more haphazardly? I think there may
| be some mis-reading of this article where people imagine that if
| Israel weren't using an AI they wouldn't drop any bombs at all,
| that's clearly unlikely given that there's a war on. Obviously
| people, including innocents, are killed in war, which is why we
| all loathe war and pray for the current one to end as quickly as
| possible.
| readyplayeremma wrote:
| > B., a senior officer who used Lavender, echoed to +972 and
| Local Call that in the current war, officers were not required
| to independently review the AI system's assessments, in order
| to save time and enable the mass production of human targets
| without hindrances.
|
| > "Everything was statistical, everything was neat -- it was
| very dry," B. said. He noted that this lack of supervision was
| permitted despite internal checks showing that Lavender's
| calculations were considered accurate only 90 percent of the
| time; in other words, it was known in advance that 10 percent
| of the human targets slated for assassination were not members
| of the Hamas military wing at all.
|
| So, there was no human sign-off. I guess the policy itself was
| ordered by someone, but all the ongoing targets that were
| selected for assassination were solely authorized by the AI
| system's predictions.
|
| This sentence is horrifically dystopian... "in order to save
| time and enable the mass production of human targets without
| hindrances"
| sequoia wrote:
| Hm OK, I read this a bit differently. I read these sections:
|
| > One source stated that human personnel often served only as
| a "rubber stamp" for the machine's decisions, adding that,
| normally, they would personally devote only about "20
| seconds" to each target before authorizing a bombing -- just
| to make sure the Lavender-marked target is male.
|
| > According to the sources, the army knew that the minimal
| human supervision in place would not discover these faults.
|
| I took this to mean that a human did press the "approve"
| button on the computer's recommendation. Though they make
| clear they were basically "rubber stamping" the machine
| recommendation.
|
| But to my point:
|
| > "There was no 'zero-error' policy. Mistakes were treated
| statistically," said a source who used Lavender.
|
| What is the "zero-error" alternative approach for dropping
| bombs in a war, or firing rockets for that matter? I don't
| understand the implicit comparison between this approach to
| targeting and a hypothetical approach that allows war to be
| waged without any innocents dying or buildings being
| destroyed. This system should be compared to whatever the
| _real alternative_ is when it comes to target selection.
| Again I know nothing about military strategy, I 'm hoping
| someone with more experience will speak up.
|
| To use an analogy: if we are talking about self-driving cars,
| the rates of collision or death should be compared the rates
| of collision or death in cars driven by humans. Comparing
| against some imaginary scenario where cars have no collisions
| and cause no deaths doesn't make sense.
| stonogo wrote:
| The difference is between inaccuracy of a weapon hitting a
| target and inaccuracy of _target selection_ in the first
| place.
|
| Remember the scene in Men In Black where the recruits do
| target practice? They were all accurate at _hitting what
| they shot at_ but only Will Smith 's character was accurate
| at _selecting a target_. This AI chooses targets; it does
| not fire weapons.
| astockwell wrote:
| Haha having recently rewatched MIB with my daughter after
| ~15 years, I don't think Will Smith correctly selected
| the target... :'D
| ceejayoz wrote:
| I think you very much missed the context of that scene.
|
| https://www.youtube.com/watch?v=ORHAP6duw9E
|
| The job is not "shoot aliens". It's _manage_ aliens,
| including Earth's population of legal resident aliens
| (like the taxi driver who he delivers a baby for). The
| Big Bad of the film is indeed posing as a human, and
| Smith's character runs into an endless procession of
| innocent (or at least non-capital-crime) aliens he
| _should not_ shoot along the way.
|
| There's a reason he gets hired over all the military
| folks in the scene immediately blasting away at the
| aliens in the shooting range.
| YeGoblynQueenne wrote:
| >> There's a reason he gets hired over all the military
| folks in the scene immediately blasting away at the
| aliens in the shooting range.
|
| Yes, because he's Will Smith.
| mandmandam wrote:
| No. There's an important point being made here.
|
| Not to be the tone police or anything, but a HN
| discussion of AI-powered mass murder really isn't the
| time to be glib.
| ok_dad wrote:
| Later in the article they talk about how they specifically
| approved up to 15-20 civilians to die with those marked
| individuals and would bomb their homes as a first option.
|
| I'm disgusted by this, I don't care anymore what happened
| in October, this needs to stop. The Israeli government cannot be
| trusted to run this war, it's turned into genocide and
| we're all complicit letting them do it and supporting them.
| I can't believe people actually support this, it's clear
| they've forgotten Palestinians are people.
| gambiting wrote:
| Israeli officials are constantly being asked "how many
| dead palestinians is too many" in this conflict, and the
| answer has explicitly been "there is no such thing" way
| too many times. There is no upper limit on how many
| people can be killed to further their goals.
|
| The most upsetting(for me) thing is reports of all the
| kids killed by snipers and just in general, as a father I
| cannot imagine losing my child to this.
|
| https://www.theguardian.com/world/2024/apr/02/gaza-
| palestini...
|
| https://www.theguardian.com/commentisfree/2024/mar/23/isr
| ael...
| lupusreal wrote:
| > _Israeli officials are constantly being asked "how many
| dead palestinians is too many" in this conflict, and the
| answer has explicitly been "there is no such thing" way
| too many times_
|
| That's because what they're doing is ethnic cleansing,
| calculated to be just slow enough to not bug out domestic
| and foreign (particularly American) support.
| Angostura wrote:
| Has any combatant in any armed struggle ever given a
| clear answer to that question?
| gambiting wrote:
| Journalists aren't asking combatants - they are asking
| politicians in the Israeli government.
|
| Because there has to be a number, right? Is 30k dead
| palestinians too many? Is 50k? is 200k? How about all of
| them?
| roywiggins wrote:
| If you multiply out the number of targets that Lavender
| generated by the number of acceptable civilian deaths per
| target, you get a number that is ~40% of all Gazans.
| digging wrote:
| > What is the "zero-error" alternative approach for
| dropping bombs in a war, or firing rockets for that matter?
|
| Honestly, I'm not sure. Obviously humans make errors of all
| sorts as well, and even make intentionally unethical
| decisions.
|
| I think the horror of this situation is that it makes war
| easier to wage. Accepting that all war has costs measured
| in blood, we should want less war. However, those in
| control of military forces always have incentive to wage
| war, so removing friction from the process is dangerous.
|
| Off-topic of AI, but on-topic of your question:
|
| The actual alternative to unleashing AI assassination is
| not human-selected targets, but _not waging war_. It isn't
| necessary to destroy Hamas with violence, it would have
| worked better to give Palestinians dignity and self-
| determination long ago. That can still work, although until
| it does Hamas will continue to be a problem. But as I said,
| war is useful for the political leaders of Israel, so they
| stoked and fed the flames for decades to maintain an excuse
| for the war machine.
| ethanbond wrote:
| Eh, some people actually have different visions for the
| world. They'll elect people who are abhorrent to western
| liberal values over and over again. I don't know what a
| new election in Gaza would yield, but I don't think it
| can be a given that giving X group dignity and self-
| determination will _necessarily_ tilt them toward western
| liberal outcomes.
| BriggyDwiggs42 wrote:
| I don't think israeli policy is or has been particularly
| effective in expanding western liberal values to
| palestinians. I'd argue putting people under such
| pressure provides the exact opposite incentives.
| ethanbond wrote:
| I didn't claim otherwise.
| digging wrote:
| > tilt them toward western liberal outcomes
|
| Fortunately, this is not what I'm hoping for! I'd much
| rather see another Rojava than another Western
| plutocracy.
| ethanbond wrote:
| > The AANES [Rojava] has widespread support for its
| universal democratic, sustainable, autonomous pluralist,
| equal, and feminist policies in dialogues with other
| parties and organizations. Northeastern Syria is
| polyethnic and home to sizeable ethnic Kurdish, Arab, and
| Assyrian populations, with smaller communities of ethnic
| Turkmen, Armenians, Circassians, and Yazidis.
|
| > The supporters of the region's administration state
| that it is an officially secular polity with direct
| democratic ambitions based on democratic confederalism
| and libertarian socialism promoting decentralization,
| gender equality, environmental sustainability, social
| ecology, and pluralistic tolerance for religious,
| cultural, and political diversity, and that these values
| are mirrored in its constitution, society, and politics
|
| So... you want a western liberal outcome?
| digging wrote:
| Oh, you meant human rights and all that? Having ideals
| and ethics? Yes, that would be my hope. I thought you
| were referring to the neoliberal hegemony of wealthy
| Western nations.
| ethanbond wrote:
| Yes, correct. Human rights is a liberal concept.
| Pluralism is a liberal concept. Secularism is a liberal
| concept. There are in fact _lots_ of people who actually
| literally disagree with these ideals. Lots of 'em in the
| Middle East, in fact, which is why you cannot assume that
| merely lifting the oppressor's thumb would yield the
| outcome that's so intrinsically appealing to your
| sensibilities that you're struggling to even identify it
| as an _opinion_ that you hold and that others may not.
|
| No, I was referring to western liberalism; that's why I
| used the term western liberalism, not "neoliberal hegemony
| of wealthy Western nations."
| davidf18 wrote:
| Palestinians were given opportunities for self-
| determination in 1948, 2000 (Camp David), 2008, and in
| Gaza in 2006 (blockaded by Egypt because Hamas was elected
| to run it). In 1948, they, along with 5 invading Arab
| countries, tried to destroy Israel, resulting in the
| destruction of their own Arab state. In 2000, Arafat
| turned down a peace agreement mediated by Bill Clinton,
| starting terrorism that resulted in 3,000 Palestinian and
| 1,000 Jewish and Israeli Arab deaths; in 2008, Abbas
| turned down another peace agreement.
|
| After 10/7 almost every Israeli knows that the
| Palestinians are not interested in their own state.
|
| Of the 32,000 Hamas stated deaths, 13,000 are terrorists,
| thus resulting in a far lower civilian-to-combatant death
| ratio than in other urban conflicts such as Mosul.
|
| The lesson learned with Japan and Germany in WWII is that
| total military defeat is necessary. The AI technology
| enables the targeting of all terrorists, not only senior-
| level terrorists as before, resulting in a quicker end to
| the conflict than otherwise and thus resulting in fewer
| civilian deaths.
|
| As we know these terrorists hide among civilians
| including in and under hospitals, making these legitimate
| targets. The high number of civilian deaths occur from
| the terrorists hiding among civilians.
| C6JEsQeQa5fCjE wrote:
| > Of the 32,000 Hamas stated deaths, 13,000 are
| terrorists
|
| 13k out of 32k is around 40%. The estimates for the
| number of murdered children and women have been about 70%
| [1] for months, so the "40% are terrorist" claim already
| does not match that unless women and children are counted
| as terrorists. Anyway, even going with only 60% of those
| murdered being women and children, that still implies
| that every single killed male person is a terrorist. Now,
| I am sure that IDF already presents this as true in order
| to justify the murders, but that will not pass basic
| logical scrutiny of any critically-thinking person.
|
| [1] 2024, March 14, https://www.msnbc.com/top-
| stories/latest/death-toll-children...
| nsguy wrote:
| Since you went off topic. If Palestinians only wanted
| dignity and self-determination this conflict would have
| been resolved a long time ago. Palestinians, broadly
| speaking, want Israel removed from the map. This is why
| they're chanting "from the river to the sea" which
| happens to include the area Israel is situated in.
|
| During the Oslo peace process, when Israel was trying to
| address this in the way you propose, Hamas launched a
| suicide bombing campaign against Israeli civilians:
|
| https://en.wikipedia.org/wiki/List_of_Palestinian_suicide
| _at...
|
| https://en.wikipedia.org/wiki/Oslo_Accords
|
| You can be critical of everything Israel does, in this
| war or ever - fine. But the Palestinians accept no
| settlement other than shipping ~8 million Jews to Europe
| or killing them.
|
| The people who suddenly developed this simplistic
| understanding of occupation/resistance/occupier have no
| idea what they're talking about. Often quite literally in
| the sense they don't even understand the meaning of what
| they're saying, not to mention the history of Israel or
| the middle east.
| runarberg wrote:
| We also have to be open to the possibility that Israel is
| committing a genocide and the goal is to kill as many
| Palestinians as possible and terrorize the rest. That the
| AI system's main purpose isn't to be accurate in selecting
| targets, but rather to manufacture a reason to kill more
| Palestinians than a human ever could. Another function
| could be to remove accountability from a targeting officer.
| Zero-error is never really a desired feature, in fact zero-
| error would be a bug, as it would prevent the genocide
| being conducted efficiently.
|
| What we may be witnessing is the first information age
| level genocide, where the killing is done at the behest of
| a statistical function with near infinite computing power.
| YeGoblynQueenne wrote:
| >> Comparing against some imaginary scenario where cars
| have no collisions and cause no deaths doesn't make sense.
|
| That's not the whole story. For example, we ban certain
| kinds of weapons -cluster munitions, chemical weapons,
| biological weapons, ideally we'd ban bloody mines- not
| because they kill too many people compared to
| "conventional" weapons (they don't) but because they are
| considered especially ... well, wrong, in the moral sense.
|
| So maybe we decide that being killed by a machine, that
| decides you're a target and pulls the trigger autonomously
| is especially morally wrong and we don't accept it.
| jhallenworld wrote:
| I imagine you get to tune the probability window of "person
| is >90% likely a Hamas terrorist" and choose how many
| innocent people you kill. Who set the window?
|
| "Hamas terrorist" criteria: a male of fighting age, give
| higher weight to those congregating with others of fighting
| age. Basically take out a generation of Palestinian men and
| you're all set. Lovely.
|
| >This sentence is horrifically dystopian... "in order to save
| time and enable the mass production of human targets without
| hindrances"
|
| Reminds me of similar industrial thinking of a certain
| previous fascist government.
| asadalt wrote:
| ...and then target them at home along with their entire
| family.
| Joker_vD wrote:
| But you see, if you kill just them, then their family
| would very likely get radicalized because of that, and
| then you'd have to kill them too, only some time later so
| it's just more efficient to do it in one fell swoop while
| you have the chance.
|
| Of _course_ it 's perfectly ethical, why do you ask?
| BriggyDwiggs42 wrote:
| See i found this old book by this machiavelli guy that
| sums up our approach perfectly. He was really onto
| something here.
| logicchains wrote:
| Chinese emperors were doing this looong before
| Machiavelli.
| cm2187 wrote:
| A war that would only kill 10% civilians would be a massive
| improvement over any recent conflict.
| ernado wrote:
| Isn't that close to the ratio in the Russian-Ukrainian war?
| A_D_E_P_T wrote:
| That ratio is by all estimates lower than 10%.
|
| UN Estimates, as of March 1st, are "10,675 [civilians]
| killed, 20,080 wounded" -- _on both sides._
|
| The number of soldiers killed on both sides (combined) is
| certainly no less than 100k, and might even exceed 400k.
|
| In Gaza, more than 25,000 civilians have already been
| killed. https://news.un.org/en/story/2024/01/1145742
|
| This is a callous, inexcusable massacre. By comparison
| with the Israelis, the Russians look like "gentle and
| parfait knights." But the former are presumably on our
| side, and the latter are our geopolitical opponents. So.
| dralley wrote:
| That's not true. The UN themselves state that their
| numbers for Ukraine are likely severely undercounting the
| total casualties simply because they don't have any
| insight into what is going on in occupied territory. They
| do not give "estimates" for Ukraine, the numbers are what
| they have been able to confirm. So for you to call that
| very specific number an "estimate" is incorrect - which
| should probably have been self-evident.
|
| >> The U.N. human rights mission in Ukraine, which has
| dozens of monitors in the country, said it expects the
| real toll to be "significantly higher" than the official
| tally since corroboration work is ongoing.
|
| https://www.reuters.com/world/europe/civilian-death-toll-
| ukr...
|
| https://www.reuters.com/world/europe/more-
| than-8000-civilian...
|
| There are more than 10,000 fresh graves in the city of
| Mariupol alone and many of them appear to contain
| multiple bodies - which was the case in other graves
| uncovered in places like Kherson and Lyman.
|
| https://apnews.com/article/russia-ukraine-war-erasing-
| mariup...
|
| The actual civilian death toll is almost certainly in the
| tens of thousands, not a singular ten thousand.
|
| Also consider the death toll caused by the withholding of
| medical assistance to those who refuse to take Russian
| citizenship, and the flooding caused by the destruction
| of the Nova Khakovka dam.
| A_D_E_P_T wrote:
| Other sources also have the number at around 10k
| fatalities. For e.g., the Harvard Kennedy School:
| https://www.russiamatters.org/blog/russia-ukraine-war-
| report...
|
| Perhaps the number is higher. What's your best estimate
| for the number of civilian casualties in Ukraine? How
| about military casualties on both sides?
|
| And, quibbling over numbers aside, surely you can see
| that the nature of the war in Gaza and the war in Ukraine
| are very different. In Gaza, civilians are taking the
| brunt of the fighting. Ukraine, in contrast, is hell for
| soldiers, but civilians and aid workers are _generally_
| moved away from the front, and they're more rarely
| treated with the wanton disregard and disdain that Gazans
| suffer.
|
| To all appearances, what's happening in Ukraine is a war,
| fought by and large by the accepted rules of war. In
| contrast, I don't think that Israel is fighting a war;
| they're marauding and taking shots at a densely populated
| civilian enclave that refuses to surrender to them
| unconditionally.
| dralley wrote:
| That's not a source, it's a link back to the very same UN
| figures I just explained the problem with. Literally if
| you follow the citation on that page for that section, it
| goes straight back to the UN report, which explains how
| each casualty was corroborated (NOT estimated.
| independently verified.)
|
| >And, quibbling over numbers aside, surely you can see
| that the nature of the war in Gaza and the war in Ukraine
| are very different. In Gaza, civilians are taking the
| brunt of the fighting.
|
| I do not see the difference between Gaza and Mariupol,
| except that the population of Mariupol is older and the
| temperatures drop below freezing for months of the year.
| It was carpet bombed, residential areas were shelled,
| there were reports of civilians needing to drink water
| from puddles, incidents of torture and murder,
| practically the entire city was destroyed.
|
| >To all appearances, what's happening in Ukraine is a
| war, fought by and large by the accepted rules of war. In
| contrast, I don't think that Israel is fighting a war;
| they're marauding and taking shots at a densely populated
| civilian enclave that refuses to surrender to them
| unconditionally.
|
| With all due respect I do not see how you can possibly
| think this unless you've been ignoring much of what has
| been happening in Ukraine.
|
| One example of many: https://www.wsj.com/video/series/in-
| depth-features/images-sh...
|
| Another: https://www.reddit.com/r/CombatFootage/comments/
| te9kvd/khark...
|
| Another: https://www.reddit.com/r/CombatFootage/comments/
| t5s44r/cctv_...
|
| Another: https://www.reddit.com/r/CombatFootage/comments/
| t4rfgy/russi...
|
| Hospital hit with a 1500kg bomb: https://www.reddit.com/r
| /CombatFootage/comments/170fues/russ...
|
| Russians using a Ukrainian POW as a human shield during
| an attack: https://www.reddit.com/r/CombatFootage/comment
| s/1azri7n/russ...
|
| Russians using 3 Ukrainian POWs as human shields during
| an attack: https://www.reddit.com/r/CombatFootage/comment
| s/18hnvkx/clai...
|
| You don't want me to share the video of Russians
| executing 9 Ukrainian POWs with their hands behind their
| backs, the video of Russians castrating a Ukrainian POW
| and then executing him, or the video of Russians
| decapitating a Ukrainian POW slowly with a knife.
|
| And Bucha, and the Nova Khahovka dam, and the torture
| chambers, and the air campaign designed in the Russians
| own words to freeze Ukrainians over the winter, and the
| mass graves in Lyman where raped and murdered women and
| tortured Ukrainian men were discovered. And the
| Kramatorsk railway station attack. And the Kremenchuk
| shopping mall attack.
|
| Literally yesterday the Russians hit an elementary school
| in Dnipro with ballistic missiles, the only reason it
| wasn't a mass casualty event was that they had 5 minutes
| warning to evacuate to bomb shelters.
|
| This is literally just what I can remember off the top of
| my head.
| A_D_E_P_T wrote:
| Sure, fine, maybe the UN report is all wrong -- even
| though everybody seems to use it.
|
| What's _your_ best estimate of civilian + military
| casualties in Ukraine, with whatever supporting evidence
| you care to muster?
|
| _Edited to add:_
|
| You've edited and added to your post after my response.
|
| In response to your Reddit links, I think that they
| distract from the main point, which is that the Gaza war
| has disproportionately affected civilians, even in
| comparison with the worst of Ukraine's battlegrounds.
|
| Ukraine has depth, and not only can its civilians move
| west to cities such as Lvov, its citizens have been
| invited into Europe.
|
| In contrast, Gaza is a sprawling low-rise cityscape with
| a population of 14,000 people per square mile -- far in
| excess of anything in Ukraine; nearly double Kiev's
| population density -- and Gazans are, for the most part,
| forbidden from leaving. Egypt can't take them, save in
| special circumstances. All the privation of war is felt
| by this civilian population -- and, at least to an
| extent, this is used by Israel as a weapon.
|
| Russia, for all its faults, has a straightforward
| strategy and straightforward, even realistic aims. I
| don't think you can say the same for Israel. It's just
| wild.
| lupusreal wrote:
| For every militant they correctly identify (90% of the
| time, they'd have us believe) and kill, they also kill
| dozens of innocents. This doesn't give them pause; on the
| contrary the Israeli public revels in the carnage and
| bring out lawn chairs to watch. It's genocide.
| monocasa wrote:
| Well, that was a 10% failure rate supposedly on selecting
| the primary target of the attack.
|
| The attack itself was allowed to have a 15x to 100x
| number of civilians killed depending on the supposed
| importance of the target.
| underlipton wrote:
| >Basically take out a generation of Palestinian men and
| you're all set.
|
| Now that we've established that this is horrific, please
| turn a small portion of your attention to American
| predictive policing systems (digital and not) and the
| circumstances that lead to mass incarceration (including
| the War on Drugs).
| manquer wrote:
| 90% is a BS number. Computed on what basis? What is the
| baseline, and how did they benchmark it? Is there any data
| whatsoever to back this claim?
|
| They just spout a high number that is not 100% (clearly,
| civilians are being killed, publicly and undeniably), since
| claiming 100% would be too obviously ridiculous.
|
| More than half of the 32,000+ killed (with more under the
| rubble) are women and children, Hamas is still quite able to
| fight, and hardly any hostages have been recovered.
|
| Israel labels any sort of civilian organization as Hamas,
| including journalists and medical and aid staff. 200 UN staff
| and 100 journalists are dead so far. Israel's argument is
| that UNRWA aids terrorists and that the journalists were also
| secretly Hamas, doing non-journalistic things when killed, so
| they are counted as legitimate targets.
|
| If you consider everyone to be Hamas unless otherwise proven,
| then 90% is possible.
|
| There is no realistic way an algorithm was designed to factor
| in a level of infrastructure destruction never seen in any
| real-world data, and also benchmarked accurately.
| nickpsecurity wrote:
| I don't like Lavender. I think humans should always be in the
| loop. I'd like to see more care by analysts for kill orders.
|
| That said, any organization might use something like this if it's
| 90% accurate. Assuming it even is (I doubt it), I think any fair
| evaluation of such a technology must ask:
|
| What is the accuracy of inexperienced humans in the same
| position who are rushing through the review during a blitz
| invasion? If they have battle experience, what about them,
| too? (I'm assuming most won't.)
|
| Is the system better than those humans or worse? How often?
|
| Do the strengths and weaknesses of the system allow
| confidence scores on predictions to know which need more
| review? Can we also increase reviews when the number of
| deaths will be high?
|
| That's how I'd start a review of this tech. If anyone is
| building military AI, I also ask that you _please_ include
| methods to highlight likely corner cases or high-stakes
| situations. Then, someone's human instincts might kick in
| where they spot and resolve a problem even in the heat of
| war.
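|
| To make the confidence-score and high-stakes flagging idea
| concrete, here is a minimal, purely illustrative Python sketch of
| triaging model outputs into review tiers. Every name, number, and
| threshold in it is an assumption made up for the example, not
| anything reported about the actual system:
|
|     from dataclasses import dataclass
|
|     @dataclass
|     class Prediction:
|         target_id: str
|         score: float              # model confidence (hypothetical)
|         expected_casualties: int  # estimated harm if acted on (hypothetical)
|
|     def review_tier(p: Prediction,
|                     min_confidence: float = 0.9,
|                     casualty_limit: int = 0) -> str:
|         """Route low-confidence or high-stakes predictions to deeper review."""
|         if p.score < min_confidence or p.expected_casualties > casualty_limit:
|             return "senior_review"    # slow, careful human decision required
|         return "standard_review"      # still human-reviewed, but routine
|
|     queue = [Prediction("a", 0.97, 0),
|              Prediction("b", 0.92, 12),
|              Prediction("c", 0.71, 0)]
|     for p in queue:
|         print(p.target_id, review_tier(p))
|     # a standard_review / b senior_review / c senior_review
|
| The point is only that flagging for extra human attention is a
| one-line rule once a system exposes calibrated confidence and some
| estimate of the stakes; whether anyone acts on the flag is the
| human part.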
| amenhotep wrote:
| It is very clear to me that that is a sentence reflecting the
| editorial interpretation of the paper rather than a direct
| quote. You might agree with the interpretation - I think I
| might - but that is very different from this specific
| sentiment being something Israeli leadership are openly
| saying.
| verisimi wrote:
| In war, the first casualty is the truth.
|
| We have no idea whether this story itself is relaying anything
| of value. For all we know, stories like this could be a part of
| the war effort.
| ahmadss wrote:
| What does that even mean? 972 is a local Israeli outfit, with
| contributors from Israel and Palestine. They have sources
| within the IDF, sources who may be center-left leaning and
| are "done" with how the far-right coalition is running this
| war and they are blowing the whistle on this practice.
| edanm wrote:
| 972 is _very_ far left, at least compared to the standard
| Israeli position, I believe. I'm happy they're reporting,
| but they have a very obvious bias, and I'd take anything
| they say with a huge amount of caution.
| FireBeyond wrote:
| > 972 is very far left, at least compared to the standard
| Israeli position, I believe.
|
| Netanyahu, who has been PM of Israel on 3 occasions, for
| 16 years, and was one of the people responsible for a
| policy of funding and arming Hamas (so Israel didn't have
| to answer awkward questions like "Arafat and the PLO are
| willing to come to the peace table and make a two state
| solution work, why aren't you?"), figuring it better to
| have an extremist opponent than a moderating one is
| categorized as being from very right wing to extreme
| right wing.
|
| So I would say that the very vast majority of reporting
| is probably left to far left of Netanyahu and his party
| position. That doesn't obviously discount their remarks,
| let alone your implication that by default, we should
| assume their words might not be accurate.
| edanm wrote:
| > So I would say that the very vast majority of reporting
| is probably left to far left of Netanyahu and his party
| position.
|
| 972 isn't just left of Netanyahu or his current
| government, which you correctly categorize as extreme
| right IMO. They are far left of almost all Israelis, many
| of whom are centrists (with not a few more left-wing
| citizens). As far as I can tell, they are far to the left
| of Haaretz, which is the more standard old-guard left-
| leaning newspaper in Israel.
|
| > That doesn't obviously discount their remarks, let
| alone your implication that by default, we should assume
| their words might not be accurate.
|
| I was implying they are inaccurate not because they lean
| left, specifically, but because they are very _biased_. I
| don't particularly trust their reporting, because in the
| few times I've read any of it, it's been fairly clear
| that they are interpreting almost everything in a way
| that is maximally "anti-Israel". That doesn't mean they
| automatically shouldn't be trusted, but they shouldn't
| automatically be trusted either.
| whearyou wrote:
| Given what you're saying is recognized throughout the
| Jewish world, your downvotes indicate a serious
| predisposition among the visitors of this forum
| pelasaco wrote:
| I had never heard of 972, but at first glance it doesn't look
| neutral. Everyone in Germany knows that this is not an
| accurate picture: https://www.972mag.com/germany-israel-palestine-
| solidarity-r...
| yrro wrote:
| "A computer can never be held accountable, therefore a computer
| must never make a management decision"
|
| The IDF only read the first half of the classic IBM slide!
| tmnvix wrote:
| > I wonder what the alternative is in a case like this.
|
| It seems obvious to me that the alternative would be a slower
| process for picking targets leading to fewer overall targets
| picked and the guarantee that a human conscience is involved in
| the process.
| ranger207 wrote:
| Or alternatively pressure from the top down on targeting
| specialists to get more and more targets selected resulting
| in less quality and effort spent on selecting targets and
| maybe leading to rubber-stamping proposed targets without
| adequate consideration. Which isn't to suggest that that
| would definitely make the AI better per se
| cess11 wrote:
| It's an army too cowardly to have dismounted infantry
| protecting their tanks, so instead their conscripts burn
| alive in there when they get in contact with actual
| militants.
|
| It's an army incompetent enough to recreate the rubble of
| Stalingrad to help its enemy.
|
| How would they go about producing officers that could enact
| such pressure? How would they recognise the difference
| between a specialist and a charlatan whose family is good
| friends with the army rabbi?
| jsmith99 wrote:
| The weirdest thing about this bizarre comment is the
| suggestion that rabbis have any influence on the Israeli
| army.
| robbomacrae wrote:
| Disturbing indeed. I've been worried a pushback against AI is
| coming, and this sort of story could be a tipping point; it
| would certainly justify a period of reflection.
|
| And you're probably right that the alternatives may be worse; the
| folks behind Lavender could probably even prove it with data.
| But there should be a moral impetus to always have a human in
| the loop regardless. And no such attempt at justification will
| capture the public's attention like a Skynet doomsday happening
| over the civilians in Gaza.
| coffeebeqn wrote:
| Pushback on AI will of course have a "National security"
| exception. If the industrial level facial recognition tech in
| Xinjiang was forgotten I doubt this will make a difference
| wruza wrote:
| _there should be a moral impetus to always have a human in
| the loop regardless_
|
| I don't understand how to come to this. War is crap, not a
| dinner party. There's always a human on both sides who will
| drop a bomb and laugh on camera, with no responsibility. Go
| watch it (actually don't, it's NSFL). Reading this thread
| feels like everyone watched and believed in that movie where
| they tried to select and eliminate a target for 2 hours with
| futuristic hi-tech. A human hesitates to press the button
| before the war. Once in it, he will only be concerned with
| things like conserving ammunition and tactical nuances. There
| is not much more morality in the human who usually sits at the
| button than in AI automation.
| outside1234 wrote:
| The thing that is different is now that human has an
| excuse: "The computer told me to put them in the oven."
| solardev wrote:
| Is having a human make those decisions really better? It was
| humans who ordered the Holocaust, My Lai, Wounded Knee, Rwanda,
| Tiananmen, etc.
|
| At least AI pretends to look at some data instead of just
| defaulting to tribal bloodlust... who's to say it can't be more
| ethical? It doesn't take much to beat our track record.
| cm2187 wrote:
| I think people are worried no one really understands how AI
| picks the target.
|
| Reminds me of that story from probably 5-7y ago. Someone
| wanted to use AI to classify photos of tanks as soviet vs US.
| So he went to a US tank museum and took lots of pictures of
| the tanks under every angle. Did the same in a soviet tank
| museum. The resulting model worked great on that training
| dataset. Then he tried on photos outside of the training
| dataset. Turned out that it was cloudy the day he visited the
| US museum and sunny for the soviet museum, and the model used
| the color of the sky to classify.
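|
| The failure mode in that story is easy to reproduce: if an
| irrelevant feature happens to correlate perfectly with the label
| in the training set, even the simplest learner will latch onto
| it. A toy Python illustration with made-up data (nothing to do
| with any real system):
|
|     # Toy rows: (sky_brightness, round_turret, label).
|     # In the "museum visits" the sky perfectly tracks the label;
|     # the genuinely informative feature (turret shape) is noisier.
|     train = [
|         (0.90, 1, "soviet"), (0.80, 1, "soviet"), (0.85, 0, "soviet"),
|         (0.20, 0, "us"),     (0.30, 0, "us"),     (0.25, 1, "us"),
|     ]
|
|     def stump_accuracy(rows, feature, threshold):
|         """Accuracy of the rule: predict 'soviet' if feature > threshold."""
|         hits = sum(
|             ("soviet" if row[feature] > threshold else "us") == row[2]
|             for row in rows
|         )
|         return hits / len(rows)
|
|     # Pick whichever single-feature rule fits the training set best.
|     candidates = [(0, 0.5), (1, 0.5)]          # (feature index, threshold)
|     best = max(candidates, key=lambda c: stump_accuracy(train, *c))
|     print("learned rule uses feature", best[0])  # 0 -> sky brightness
|
|     # On "field" photos the confound is gone and the rule collapses.
|     field = [(0.90, 0, "us"), (0.10, 1, "soviet")]
|     print("field accuracy:", stump_accuracy(field, *best))  # 0.0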
| artificial wrote:
| Seems like segmentation would be a better approach to
| identity objects in a photo rather than various other
| features.
| ben_w wrote:
| An eternal story; I heard the same thing at university 22
| years ago, except then it was NATO taking nice crisp in-
| focus photos of their own tanks from close up, while the
| images of Soviet tanks were all blurry and grainy because
| they came from high-altitude spy planes.
|
| (This kind of human model hallucination is how and why I
| think Genesis got written and taken seriously).
|
| https://gwern.net/tank
| solardev wrote:
| > I think people are worried no one really understands how
| AI picks the target.
|
| Yeah, I mean, black-box murder is never really desirable...
| but is it fair to assume AI will never be able to elucidate
| its reasoning? And that also seems a bit of a double
| standard, when so many life-and-death decisions made by
| humans are also not entirely comprehensible or transparent,
| either to the general public or sometimes even to the other
| individuals closest to the decision-maker.
|
| Sometimes it's a snap judgment, sometimes it's a gut
| feeling, sometimes it's bad intel, sometimes it's just
| plain "because I said so"... not every kill list is the
| result of a reasoned, transparent, fair and ethical
| process.
|
| After all, how long have Israel and Hamas (or other groups)
| been at each other's throats, with cries of injustice and
| atrocities about either side, from observers all over the
| world? And it wasn't so long ago we destroyed Afghanistan
| and Iraq, and Russia is still going at it because of the
| desires of _one man_. AI doesn't have to be perfect to be
| better than us.
|
| If there's one thing humans are really, really bad at, it's
| letting objective data overrule our emotional states. It
| takes exceptional training and mental fortitude to be able
| to do that under pressure, especially life-and-death, us-
| vs-them pressure.
|
| Humans make mistakes, too, and friend-or-foe identification
| isn't easy for humans either, especially in the heat of
| battle or in poor visibility. Training for either humans or
| AI can always be improved, but probably will never reach
| 100% accuracy.
|
| Maybe we should start putting some hypothetical kill lists
| in front of both humans and AI, recording their decisions,
| and comparing them after a few years to see who did
| "better". I wouldn't necessarily bet on the humans...
| jhallenworld wrote:
| Perhaps it performs sentiment analysis of your social media
| posts.
| solardev wrote:
| So _that's_ why I'm still alive. Hi, robo-overlords! Sarah
| Connor sux. Save me for last!
| cess11 wrote:
| What happens when you put a computer in front of the judges
| at the ICC?
| solardev wrote:
| They ask an assistant for help?
|
| Run it through some panel of experts and demand algorithm
| changes?
|
| Send it to some Judge API and get back some JSON?
|
| I dunno, what?
|
| They're not exactly very good at preventing or punishing
| human atrocities, either... it's more of a symbolic group,
| or a tool of the victors, than anything resembling actual
| justice. I'd argue textbook authors have more of a lasting
| ethical impact than the ICC.
| pvaldes wrote:
| When a computer program designed by a human "makes" the
| decision, humans can claim that it was "a funny mistake," that
| it was not their fault, and pretend to be very sad about it.
|
| Having a human make those decisions is better because that
| human can be judged if he commits war crimes or genocide or
| violates the international laws of war.
|
| A computer can't be jailed, and this is the real power of
| designing such a system: to hide the criminals in a black box
| so nobody can be held responsible.
| t_serpico wrote:
| Exactly my thoughts, the AI shields all responsibility from
| the humans.
| solarpunk wrote:
| What if the AI was trained on data collected and assembled by
| someone with "tribal bloodlust".
| cm2187 wrote:
| We crossed the line of machines that automatically kill a long
| time ago. A heat seeking missile, or a shell that detects and
| target tanks [1] are effectively doing that. Software selects
| the target. The soldier only points in the general direction.
| AI is only a small technical increment.
|
| [1] https://en.wikipedia.org/wiki/SMArt_155
| LordShredda wrote:
| But you know soldiers are in a tank, and you know a pilot is
| in a plane. Who's in an apartment?
| solardev wrote:
| It's never really that clear-cut, though. Human drone
| operators, pilots, etc. routinely send missiles into cars,
| buildings, weddings, etc. that cause collateral damage,
| killing or maiming innocents and passers-by. Sometimes it's
| an accident, but not always.
|
| And that's just when we even _try_ to limit damage, vs
| indiscriminately firebombing or nuking entire cities.
|
| We shouldn't demand perfect accuracy of AI when we don't
| expect the same of humans. Long ago, we decided collateral
| damage in war is acceptable, especially when you end up
| winning the war and there's nobody left to prosecute you
| except historians =/
| underlipton wrote:
| >I wonder what the alternative is in a case like this.
|
| Don't Create The Torment Nexus
|
| I think that once you start from the viewpoint that you're not
| going to create the Torment Nexus, it becomes a lot easier to
| avoid creating the Torment Nexus.
| 0x457 wrote:
| This system basically just gave everyone a score from 1 to 100
| of how likely they are to be part of the military wing of Hamas.
|
| Another system would signal that the target is at home and it's
| time to bomb. That system was using phones to geolocate, and due
| to the nature of living in Gaza, phones change hands often.
|
| Without Lavender they would have dropped fewer bombs IMO.
| BriggyDwiggs42 wrote:
| Look I know this is gonna sound cliche but the thing they
| should do is not engage in an offensive asymmetrical war and
| bomb a dense urban area full of innocents for basically no
| reason. Then they wouldn't need the little AI.
| sequoia wrote:
| This is obviously veering way off course of the topic of AI
| at this point, but I imagine the residents of kibbutz Be'eri
| and the 100+ hostages still held in Gaza would disagree that
| Israel is fighting for "basically no reason." I'm interested
| in analysis and criticism of Israel's use of AI in this case,
| but suggesting Israel has no _casus belli_ is absurd.
| mandmandam wrote:
| OP didn't say Israel is fighting for basically no reason.
|
| You're twisting their words, I'll assume out of a
| misreading. Read the comment again. They clearly said that
| there's no good reason to bomb Gaza the way that they have
| been doing, resulting in the murder of thousands and
| thousands and thousands and thousands of civilians.
| kromem wrote:
| There would have been slower target selection.
|
| A lot of news around the bombing called out the uniquely large
| scale and rapidity of the campaign.
|
| This was a preview of future conflicts.
|
| We're entering the WWI phase of new technology being brought
| without rules to conflicts where the abuses will be horrific
| until rules are finally put in place.
| emadabdulrahim wrote:
| You claim to be disturbed by reading this extensive report.
| Yet, you're saying "I wonder what the alternative is", and you
| call what's happening a "war". That's already telling that you
| have no clue what's going on, and that you've fallen gullible
| victim to Western media.
|
| Israel is a terrorist state that is not engaged in a "war"
| against Hamas. Israel is conducting a full scale genocide
| against the Palestinians in Gaza. Israel has absolutely NO
| legal grounds to "defend" itself against a territory it is
| occupying and has been occupying for 75 years, and it has
| kept a blockade on Gaza since 2006. Israel has been murdering
| Palestinians by the hundreds a year and no one was batting an
| eye or holding them accountable for anything.
|
| And now you say what's the alternative... and claim to be
| "disturbed" by their use of AI, but you don't realize they, the
| Zionist state, has absolutely no rights whatsoever to harm a
| Palestinian, let alone murder, let alone starve, let alone use
| AI to massacre thousands of children, men, and women.
|
| Wake up.
| leke wrote:
| This is speculation at this point, but I wonder if Lavender was
| involved in the recent killing of the World Central Kitchen aid
| workers.
| tmnvix wrote:
| May have been involved, but I believe in that case there was an
| explicit human decision made after referring to a senior. I
| recall somebody quoting an official to this effect.
| jhallenworld wrote:
| This should be asked explicitly.
| peeters wrote:
| It seems unlikely based on what has been revealed about the
| system. It seems like Lavender is a classification AI that
| plugs static details about a person into a NN of some sort and
| spits out a score of how likely they're to be involved with
| Hamas. Score above a certain threshold and your home becomes a
| target for a dumb bomb.
|
| The World Central Kitchen attack appears to have used smart
| munitions (missiles from a drone) on a mobile truck.
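|
| If that description is accurate, the core logic is a very
| ordinary score-and-threshold classifier. A minimal sketch of that
| general pattern in Python (every feature name, weight, and cutoff
| here is invented for illustration and is not taken from the
| reporting):
|
|     import math
|
|     # Made-up weights over made-up features, plus a fixed cutoff.
|     WEIGHTS = {"feature_a": 2.0, "feature_b": 1.5, "feature_c": -1.0}
|     BIAS = -3.0
|     THRESHOLD = 0.9
|
|     def score(features: dict) -> float:
|         """Logistic score in [0, 1] from a weighted sum of inputs."""
|         z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
|         return 1.0 / (1.0 + math.exp(-z))
|
|     def flagged(features: dict) -> bool:
|         """The only 'decision' is whether the score clears the cutoff."""
|         return score(features) >= THRESHOLD
|
|     print(flagged({"feature_a": 1.0, "feature_b": 1.0, "feature_c": 0.0}))
|     # -> False for this made-up input
|
| The unsettling part is how little is going on: a weighted sum and
| a cutoff, with all of the consequence carried by where the cutoff
| is set and by what happens to whatever crosses it.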
| worddepress wrote:
| Morally it doesn't up the ante, of course; they are already well
| into a genocide. But optically, killing Westerners is different,
| especially when they are clearly doing aid work and you can't
| throw "they're terrorists" shade on it. The World Central Kitchen
| incident has increased the strength of the platitudes coming from
| other countries. But we're not seeing any arms or trade sanctions
| yet, and none of the "pausing of funds while we investigate"
| treatment reserved for anyone supporting the people of Gaza.
| rightbyte wrote:
| How does this system get the input? Are Palestinians using IDF
| tapped cell towers? Or is it possible to use roaming towers for
| this? Is e.g. Google or Facebook involved on a mobile OS or app
| level? Maybe backdoors local to the area?
|
| It seems like the whole cell phone infrastructure needs to be torn
| down.
| jhallenworld wrote:
| My guess: facial recognition. It's easy, if you're a male of
| fighting age you're a Hamas terrorist.
|
| The social media input is terrifying: show any Palestinian
| sympathies (sentiment analysis) in your posts and you're on the
| list.
| rightbyte wrote:
| That does not explain how the IDF know that the victims are
| at home. You'd more or less need security cameras for that.
|
| I guess you could do some sort of common principal component
| analysis (CPCA) on known Hamas persons to create clusters
| based on cell phone location data or call data, somewhat like
| Spotify does with recommendations from "common songs".
|
| I wonder if this might explain why so many journalists are
| killed, since they probably call Hamas leaders and meet them
| a lot more than most people in the data set.
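|
| For what it's worth, the "common songs" analogy maps onto a very
| simple guilt-by-association score over a contact graph. A purely
| speculative Python sketch of what that could look like, with
| entirely made-up data and no claim about how the real system
| works:
|
|     from collections import defaultdict
|
|     # Hypothetical call records: (caller, callee).
|     calls = [("p1", "k1"), ("p1", "k2"), ("p2", "k1"),
|              ("p3", "p1"), ("p2", "p3")]
|     seeds = {"k1", "k2"}   # already-identified people
|
|     # Build an undirected contact graph.
|     contacts = defaultdict(set)
|     for a, b in calls:
|         contacts[a].add(b)
|         contacts[b].add(a)
|
|     def association_score(person):
|         """Fraction of a person's contacts that are in the seed set."""
|         peers = contacts[person]
|         return len(peers & seeds) / len(peers) if peers else 0.0
|
|     for person in ("p1", "p2", "p3"):
|         print(person, round(association_score(person), 2))
|     # p1 0.67, p2 0.5, p3 0.0
|
| Which is exactly why journalists would score high: anyone whose
| job forces them into frequent contact with the seed set looks
| "associated" to a metric like this, and that bias is baked in
| before any human review happens.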
| monocasa wrote:
| > Are Palestinians using IDF tapped cell towers?
|
| That's my understanding. That the whole of the Gaza strip is
| essentially watched under the equivalent of stingrays and all
| traffic out is monitored with room 641a style taps.
|
| https://en.wikipedia.org/wiki/Stingray_phone_tracker
|
| https://en.wikipedia.org/wiki/Room_641A
| someotherperson wrote:
| Hmm, I wonder if that is related to why the use of 3G barely
| just rolled out and why they still aren't allowed to have 4G.
| Maybe that would require an upgrade of Stingray-like
| equipment?
| scotty79 wrote:
| It's so terrible to be a human shield in a conflict where
| neither side values your life.
| asadalt wrote:
| Lavender: this generation's gas chamber.
| scotty79 wrote:
| These descriptions are chilling. The mechanistic theme of
| efficiency is reminiscent of death camps.
|
| We can kill more. Feed us targets. We can do it cheaply and fast.
| 10-20 civilians per one speculative target is acceptable for us.
| scotty79 wrote:
| Apart from all the horribleness and the knowing murder of
| civilians, the idea of a 9-to-5 soldier who performs military
| activity, then goes home to his family, well within range of the
| enemy's weapons and intelligence, expecting that he and his family
| will be safe there while he sleeps, is a bit insane. I can't
| imagine any army hellbent on winning fast passing up that
| opportunity.
|
| The USA didn't exactly have much stricter conditions or much
| better accuracy in its intelligence. They did nothing
| qualitatively different. They just labeled anyone in the blast
| radius as unknown enemy combatants in the reports. And the USA
| never had to operate at this volume. I guess that's just how
| modern war looks from a position of superior firepower.
| kayodelycaon wrote:
| I wonder if this explains why it seems like they are constantly
| hitting random targets in addition to everything else.
| dartos wrote:
| Don't militaries use statistical models all the time?
|
| Is this any different?
| tmnvix wrote:
| Given the total failure to achieve any of its stated objectives,
| has this use of AI benefited the IDF at all?
|
| I would argue that the only outcome it has had that directly
| relates to IDF objectives has probably been negative (i.e. the
| unintended killing of hostages).
|
| Sadly, I think that the continued use of this AI is supported
| because it is helping to provide cover for individuals involved
| in war crimes. I wouldn't be surprised if the AI really weren't
| very sophisticated at all; to serve the purpose of cover, it
| doesn't need to be.
| steviedotboston wrote:
| > Given the total failure to achieve any of its stated
| objectives, has this use of AI benefited the IDF at all?
|
| Hamas has been considerably diminished. It's not accurate to
| say the war has been a "total failure".
| tmnvix wrote:
| Politically and diplomatically, it could be argued Hamas have
| been considerably strengthened. They certainly think so.
|
| It seems to me that Israel's overall position - politically,
| diplomatically, and in terms of physical security - has
| become much worse since the October 7 attack and it has been
| their own actions that are responsible for the change. A
| different response should have seen them politically and
| diplomatically strengthened.
|
| I understand the emotive reasons for not doing so, but I
| think most people would consider that Israel has bungled
| their response to October 7.
|
| I would call this attack on Gaza a total failure. If nothing
| else a failure of humanity.
|
| It's looking more and more like the 'winners' in this
| situation are Hamas and the losers are the Israeli
| government, the US government, and the Israeli and
| Palestinian people.
| YeGoblynQueenne wrote:
| I'm not sure Hamas has anything left to win. Gaza is in
| ruins. If things go on the way they are for very much
| longer, there won't even be left any Palestinians in Gaza,
| only Hamas in its tunnels. The lords of the underground...
| buried under the rubble. That's not a vision of victory.
| gizmondo wrote:
| > Given the total failure to achieve any of its stated
| objectives, has this use of AI benefited the IDF at all?
|
| Their invasion of Gaza City went way better than most analysts
| expected, with minimal casualties among Israelis. So
| probably? Hard to compare with the alternative reality where
| they select the targets the old way.
|
| That their stated objectives are likely unachievable is a
| different issue.
| bananapub wrote:
| perhaps apocryphal quote from IBM: "A COMPUTER
| CAN NEVER BE HELD ACCOUNTABLE THEREFORE A COMPUTER
| MUST NEVER MAKE A MANAGEMENT DECISION"
|
| It's sort of irrelevant if some shitty computer system is killing
| people - the people who need to be arrested are the people who
| allowed the shitty computer system to do that. We obviously
| cannot allow "oh, not my fault, I chose to allow a computer to
| kill people" to be an excuse or a defence for murder or
| manslaughter or ... anything.
| nojvek wrote:
| US supporting Ukraine made sense; Russia was the clear aggressor.
|
| US supporting Israel makes very little sense.
|
| That being said, Trump signed a bill to remove reporting of drone
| strikes by the US military, and he approved more strikes than Obama.
|
| So US likely has amplified systems compared to Lavender and
| Gospel. We'd have no idea.
|
| This season of Daily Show about AI comes to mind:
| https://www.youtube.com/watch?v=20TAkcy3aBY
|
| Everyone claiming AI is going to do great good, solve climate
| change, yada yada, is deeply deluded.
|
| AI will only amplify what corporations and state powers already
| do.
| FridgeSeal wrote:
| > "Where's Daddy?" also revealed here for the first time, were
| used specifically to track the targeted individuals and carry out
| bombings when they had entered their family's residences.
|
| That is appalling.
| frob wrote:
| That is genocide.
| pvaldes wrote:
| Very good name choice. An accurate combination of law and ender
| bythreads wrote:
| That seems like a very, very political site judging from the other
| articles - also, half of it seems AI generated - are we sure this
| holds up?
| gerash wrote:
| Is there a list of congress people who support sending our tax
| money to Israel?
| pphysch wrote:
| opensecrets.org has some of it, which is used by other groups
| like the @trackAIPAC Twitter/X account.
| gerash wrote:
| This practice is akin to physically and mentally abusing a puppy,
| letting it grow into a fearful and aggressive dog, and then saying:
| "what an aggressive dog! It needs to be euthanized."
| dist-epoch wrote:
| Meanwhile China is working on automated building facilities which
| can make 1,000 cruise missiles per day:
|
| https://twitter.com/Aryan_warlord/status/1774859594747273711
|
| Perfect match for a targeting AI, the AI could even customize
| each missile as it's being built according to the target it
| selected.
| firtoz wrote:
| > The following investigation is organized according to the six
| chronological stages of the Israeli army's highly automated
| target production in the early weeks of the Gaza war. First, we
| explain the Lavender machine itself, which marked tens of
| thousands of Palestinians using AI. Second, we reveal the
| "Where's Daddy?" system, which tracked these targets and signaled
| to the army when they entered their family homes. Third, we
| describe how "dumb" bombs were chosen to strike these homes.
|
| > Fourth, we explain how the army loosened the permitted number
| of civilians who could be killed during the bombing of a target.
| Fifth, we note how automated software inaccurately calculated the
| amount of non-combatants in each household. And sixth, we show
| how on several occasions, when a home was struck, usually at
| night, the individual target was sometimes not inside at all,
| because military officers did not verify the information in real
| time.
|
| Tbh this feels like making a machine that points at a random
| point on the map by rolling two sets of dice, and then yelling
| "more blood for the blood god" before throwing a cluster bomb
| Gud wrote:
| Holy shit if this is true. Who are +972mag and how reliable are
| they?
| screye wrote:
| Technology like this raises a moral conundrum.
|
| Minimizing deaths is the humane approach to war. So we move away
| from broad killing mechanisms (shelling, crude explosives, carpet
| bombing), in favor of precise killing machines. Drones, targeted
| missiles and now AI allow you to be ruthlessly efficient in
| killing an enemy.
|
| The question is - How cold and not-human-like can these methods
| be, if they are in fact reducing overall deaths ?
|
| I won't pretend an answer is obvious.
|
| The west hasn't seen a real war in a long time. Their impression
| of war is either ww1 style mass deaths on both sides or overnight
| annihilation like America's attempts in the middle east. So our
| vocabulary limits us to words like Genocide, Overthrow,
| Insurgency, etc. This is war. It might not map onto our
| intuitions from recent memory, but this is exactly what it looks
| like.
|
| When you're in a long drawn out war with a technological upper
| hand...you leverage all technology to help you win. At the same
| time, once pandoras box is open, it tends to stay open for your
| adversaries as well. We did well to maintain global consensus on
| chemical and nuclear warfare. I don't see any such consensus
| coming out of the AI era just yet.
|
| All I'll say is that I won't be quick to make judgements on the
| morality of such tech in war. What do you think happened to the
| spies who were caught due to the decoding of the Enigma?
| asmallcat wrote:
| The sad and simple truth (trying to not sound political, but it's
| pretty damned hard given the context) is that it seems that not
| so long ago, lists and very flimsy justifications were at the
| root of a lot of pain and suffering for the very people
| perpetrating the same.
| cpcat wrote:
| What is the next article? AI launched a nuclear missile?
| 0x38B wrote:
| I expected more comments on the source's biases, given the
| contentious and sensitive topic; journalist Liel Leibovitz writes
| this about +972 Magazine (1):
|
| > Underlining everything +972 does is a dedication to promoting a
| progressive worldview of Israeli politics, advocating an end to
| the Israeli occupation of the West Bank, and protecting human and
| civil rights in Israel and Palestine.
|
| > And while the magazine's reported pieces--roughly half of its
| content--adhere to sound journalistic practices of news gathering
| and unbiased reporting, its op-eds and critical essays support
| specific causes and are aimed at social and political change.
|
| 1: https://www.tabletmag.com/sections/israel-middle-
| east/articl...
| luketaylor wrote:
| This article falls under "reported pieces," not "op-eds and
| critical essays"
| __lbracket__ wrote:
| Heartbreaking. I seriously wonder if Hamas expected this level of
| retaliation.
| nickdothutton wrote:
| I am reminded of Poindexter's[1] total information awareness
| project, which I thought at the time too interesting for it to
| wholly disappear. I must admit this knowledge influenced one or
| two of my own blog postings on what I call "Strategic
| Software"[2].
|
| [1]: https://en.wikipedia.org/wiki/Total_Information_Awareness
| [2]: https://blog.eutopian.io/tags/strategic-software/
| koutetsu wrote:
| As someone working in the AI field, I find this use of AI truly
| terrifying. Today it may be used to target Hamas and accept a
| relatively large number of civilian deaths as permissible
| collateral damage, but nothing guarantees that it won't be
| exported and used somewhere else. On top of that, I don't think
| anything is done to alleviate biases in the data (if you're used
| to targeting people from a certain group, then your AI system will
| still target people from that group) or to validate the predictions
| after a "target" is bombed. I wish there were more regulations for
| these use cases. Too bad the EU AI Act doesn't address military
| uses at all.
| onethought wrote:
| Given we don't know what it's using to identify people, we don't
| really know its biases. "Holding a military weapon" probably
| doesn't contain a whole lot of bias (of course there is
| misidentification).
| carabiner wrote:
| It is so weird that the US is sending aid to help people harmed
| by US weapons.
| fhd2 wrote:
| > One source stated that human personnel often served only as a
| "rubber stamp" for the machine's decisions, adding that,
| normally, they would personally devote only about "20 seconds" to
| each target before authorizing a bombing [...]
|
| Brings the Ironies of Automation paper to mind:
| https://en.m.wikipedia.org/wiki/Ironies_of_Automation
|
| Specifically: If _most_ of a task is automated, human oversight
| becomes near useless. People get bored, are under time pressure,
| don't find enough mistakes etc and just don't do the review job
| they're supposed to do anymore.
|
| A dystopian travesty.
| jmyeet wrote:
| Unfortunately, Big Tech has been very effective in spreading a
| message that helps Israel maintain the plausible deniability that
| comes from a system like Lavender.
|
| For at least 15 years we've had personalized newsfeeds in social
| media. For even longer we've had search engine ranking, which is
| also personalized. Whenever criticism is levelled against Meta or
| Twitter or Google or whoever for the results on that ranking,
| it's simply blamed on "the algorithm". That serves the same
| purpose: to provide moral cover for human actions.
|
| We've seen the effects of direct human intervention in cases like
| Google Panda [1]. We also know that search engines and newsfeeds
| filter out and/or downrank objectionable content. That includes
| obvious categories (eg CSAM, anything else illegal) but it also
| includes value-based judgements on perfectly legitimate content
| (eg [2]).
|
| Lavender is Israel saying "the algorithm" decided what to strike.
|
| I want to put this in context. In ~20 years of the Vietnam War,
| 63 journalists were killed or lost (presumed dead) [3]. In the 6
| months since October 7, at least 95 journalists have been killed
| in Gaza [4]. In the years prior there were still a large number
| killed [5], famously including the American citizen Shireen Abu
| Akleh [6].
|
| None of this is an accident.
|
| My point here is that anyone who blames "the algorithm" or
| deflects to some ML system is purposely deflecting responsibility
| from the human actions that led to that and for that to continue
| to exist.
|
| [1]: https://en.wikipedia.org/wiki/Google_Panda
|
| [2]: https://www.hrw.org/report/2023/12/21/metas-broken-
| promises/...
|
| [3]:
| https://en.wikipedia.org/wiki/List_of_journalists_killed_and...
|
| [4]: https://cpj.org/2024/04/journalist-casualties-in-the-
| israel-...
|
| [5]:
| https://en.wikipedia.org/wiki/List_of_journalists_killed_dur...
|
| [6]: https://en.wikipedia.org/wiki/Killing_of_Shireen_Abu_Akleh
___________________________________________________________________
(page generated 2024-04-03 23:00 UTC)