[HN Gopher] More than 140 Kenya Facebook moderators sue after di...
___________________________________________________________________
More than 140 Kenya Facebook moderators sue after diagnoses of PTSD
Author : uxhacker
Score : 342 points
Date : 2024-12-18 15:20 UTC (4 days ago)
(HTM) web link (www.theguardian.com)
(TXT) w3m dump (www.theguardian.com)
| fouronnes3 wrote:
| Absolutely grim. I wouldn't wish that job on my worst enemy. The
| article reminded me of a Radiolab episode from 2018:
| https://radiolab.org/podcast/post-no-evil
| nappy-doo wrote:
| I worked at FB for almost 2 years. (I left as soon as I could; I
| knew it wasn't a good fit for me.)
|
| I took an Uber from campus one day, and my driver, a twenty-
| something girl, was asking how to become a moderator. I told her,
| "no amount of money would be enough for me to do that job. Don't
| do it."
|
| I don't know if she eventually got the job, but I hope she
| didn't.
| narrator wrote:
| Yes, these jobs are horrible. However, I do know from
| accidentally encountering bad stuff on the internet that you want
| to be as far away from a modern battlefield as possible.
|
| It's just kind of ridiculous how people think war is like Call
| of Duty. One minute you're sitting in a trench, the next you're
| a pile of undifferentiated blood and guts. Same goes for car
| accidents and stuff. People really underestimate how fragile we
| are as human beings. Becoming aware of this is super damaging
| to our concept of normal life.
| int_19h wrote:
| It's not that we're particularly fragile, given the kind of
| physical trauma human beings can survive and recover from.
|
| It's that we have technologically engineered things that are
| destructive enough to get even past that threshold. Modern
| warfare in particular is insanely energetic in the most
| literal, physical way - when you measure the energy output of
| weapons in joules. Partly because we're just that good at
| making things explode, and partly because improvements in
| metallurgy and electronics made it possible over time to
| locate targets with extreme precision in real time and then
| concentrate a lot of firepower directly on them. This, in
| particular, is why the most intense battlefields in Ukraine
| often look worse than WW1 and WW2 battles of similar
| intensity (e.g. Mariupol had more buildings destroyed than
| Stalingrad).
|
| But even our small arms deliver much more energy to the
| target than their historical equivalents. Bows and arrows
| pack ~150 J at close range, rapidly diminishing with
| distance. Crossbows can increase this to ~400 J. For
| comparison, an AK-47 firing standard issue military ammo is
| ~2000 J.
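|
| (As a rough cross-check, muzzle energy is E = (1/2)mv^2: a
| standard 7.62x39mm round, assuming a ~7.9 g bullet at ~715
| m/s, gives 0.5 x 0.0079 x 715^2 ≈ 2000 J.)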
| llm_trw wrote:
| Watch how a group of wild dogs kill their prey, then
| realise that for millennia human-like apes were part of their
| diet. Even the modern battlefield is more humane than the
| African savannah.
| ChrisMarshallNY wrote:
| That reminds me of this[0]. It's a segment of BBC's
| _Planet Earth_, where a pack of Cape Hunting Dogs are
| filmed, hunting.
|
| It's almost military precision.
|
| [0] https://www.youtube.com/watch?v=MRS4XrKRFMA
| newsclues wrote:
| Humans can render other humans unrecognizable with a rock.
|
| Brutal murder is low tech.
| int_19h wrote:
| Not at scale.
| newsclues wrote:
| Armies scale up.
|
| It's like the original massive scale organization.
| paulryanrogers wrote:
| Scaling an army of rock swingers is a lot more work than
| giving one person an AK47 (when all who would oppose them
| have rocks).
|
| (Thankfully in the US we worship the 2A and its most
| twisted interpretation. So our toddlers do shooter
| drills. /s)
| newsclues wrote:
| You are discounting the complexity of the logistics
| required for an AK47 army. You need ammo, spare parts,
| lubricant and cleaning tools. You need a factory to build
| the weapon, and churn out ammunition.
|
| Or, gather a group of people, tell them to find a rock,
| and go bash the other side's head.
| com2kid wrote:
| > You need ammo, spare parts, lubricant and cleaning
| tools.
|
| The AK-47 famously only needs the first item in that
| list.
|
| That being the key to its popularity.
| int_19h wrote:
| It should be noted that the purported advantages of AK
| action over its competitors in this regard are rather
| drastically overstated in popular culture. E.g. take a
| look at these two vids showing how AK vs AR-15 handle
| lots of mud:
|
| https://www.youtube.com/watch?v=DX73uXs3xGU
|
| https://www.youtube.com/watch?v=YAneTFiz5WU
|
| As far as cleaning, the AK, like many guns of that era,
| carries its own cleaning & maintenance toolkit inside the
| gun. It is a bit unusual, though, in that this kit is, in
| fact, sufficient to remove _any_ part of the gun that is
| not permanently attached. Which is to say, the AK can be
| serviced in the field, without an armory, to a greater
| extent than most other options.
|
| But the main reason why it's so popular isn't so much
| because of any of that, but rather because it's very
| cheap to produce _at scale_, and China especially has
| been producing millions of AKs specifically to dump them
| in Africa, the Middle East etc. But where large
| quantities of other firearms are available for whatever
| reason, you see them used just as much - e.g. the Taliban
| has been rocking a lot of M4s and M16s since the US left
| a lot of stocks behind.
| int_19h wrote:
| Complexity of logistics applies to any large army. The
| single biggest limiting factor for most of history has
| been the need to either carry your own food, or find it
| in the field. This is why large-scale military violence
| requires states.
| graemep wrote:
| > Humans can render other humans unrecognizable with a
| rock.
|
| They are much less likely to.
|
| We have an instinctive repulsion to violence, especially
| to prolonging it (e.g. if the rock does not kill at the
| first blow).
|
| It is much easier to kill with a gun (and even then
| people need training to be willing to do it), and easier
| still to fire a missile at people you cannot even see.
| meiraleal wrote:
| Than throwing a face punch or a rock? You should check
| public schools.
| ternnoburn wrote:
| Than _killing_ with bare hands or a rock, which I believe
| is still pretty uncommon in schools.
| meiraleal wrote:
| GP didn't talk about killing
| graemep wrote:
| Extreme violence then? With rocks, clubs or bare hands? I
| was responding to "render other humans unrecognizable
| with a rock" which I am pretty sure is uncommon in
| schools.
| ternnoburn wrote:
| Render unrecognizable? Yeah, I guess that could be
| survivable, but it's definitely lethal intent.
| meiraleal wrote:
| That's possible with just a well placed punch to the nose
| or to one of the eyes. I've seen and done that, in public
| schools.
| graemep wrote:
| Not in public schools in the British sense. I assume it
| varies in public schools in the American sense, and I am
| guessing violence sufficient to render someone
| unrecognisable is pretty rare even in the worst of them.
| Dalewyn wrote:
| >Crossbows can increase this to ~400 J.
|
| Funny you mention crossbows; the Church at one point in
| time tried to ban them because they democratized violence,
| making killing trivially easy. They were the nuclear bombs and
| assault rifles of medieval times.
|
| Also, I will take this moment to mention that the
| "problem" with weapons always seems to be how quickly they
| can kill rather than the killing itself. Kind of takes away
| from the discussion once that is realized.
| noduerme wrote:
| Watching someone you love die of cancer is also super
| damaging to one's concept of normal life. _Getting_ a
| diagnosis, or _being_ in a bad car accident, or the victim of
| a violent assault is, too. I think a personal sense of
| normality is nothing more than the state of mind where we can
| blissfully (and temporarily) forget about our own mortality.
| Obviously, marinating yourself in all the horrible stuff
| makes it really hard to maintain that state of mind.
|
| On the other hand, never seeing or reckoning with or
| preparing for how brutal reality actually is can lead to a
| pretty bad shock once something bad happens around you. And
| maybe worse, can lead you to under-appreciate how fantastic
| and beautiful the quotidian moments of your normal life
| actually are. I think it's important to develop a concept of
| normal life that doesn't completely ignore that really bad
| things happen all around us, all the time.
| karlgkk wrote:
| Frankly, there's a difference between a one-off, two-off,
| or even ten-off exposure to the brutality of life, where
| various people in your life will support you and help you
| acclimate to it,
|
| versus straight up mainlining it for 8 hours a day.
| stephenitis wrote:
| hey kid, hope you're having a good life. I'll look at a
| screen full of the worst the internet and humanity have
| produced for eight hours.
|
| I get your idea but in the context of this topic I think
| you're overreaching
| keybored wrote:
| ... okay.
|
| Emergency personnel might need to brace themselves for car
| accidents every day. That Kenyans need to be traumatized by
| Internet Content in order to make a living is just silly
| and unnecessary.
| ashoeafoot wrote:
| ISISomalia loves that recruitment pool though
| portaouflop wrote:
| Car "accidents" are also completely unnecessary.
|
| Even the wording is wrong - those aren't accidents, it is
| something we accept as a byproduct of a car-centric
| culture.
|
| People feel it is acceptable that thousands of people die
| on the road so we can go places faster. Similarly they
| feel it's acceptable to traumatise some foreigners to
| keep social media running.
| keybored wrote:
| Nitpick that irrelevant example if you want.
| Der_Einzige wrote:
| Actually reckoning with this stuff leads people into
| believing in anti-natalism, negative utilitarianism,
| Schopenhauer/Philipp Mainländer (Mainländer btw was not just
| pro-suicide, he actually killed himself!), and the
| voluntary extinction movement. This terrified other
| philosophers like Nietzsche, who spends most of his work
| defending reality even if it's absolute shit. "Amor Fati",
| "Infinite Regress/Eternal Recurrence", "Ubermensch" vs the
| literal "Last Man". "Wall-E" of all films was the modern
| quintessential Nietzschean fable, with maybe "Children of
| Men" being the previous good one before that.
|
| You're literally not allowed to acknowledge that this stuff
| is bad and adopt one of the religions that see this and try
| to remove suffering - i.e. Jainism, because _at least
| historically_ doing so meant you couldn't use violence in
| any circumstances, which also meant that your neighbor
| would murder you. There's a reason that the Jain population
| is in the low millions.
|
| Reality is actually bad, and it should be far more
| intuitive to folks. The fact that positive experience is
| felt "quickly" and negative experience is felt "slowly" was
| all the evidence I needed that I wouldn't just press the
| "instantly and painlessly and without warning destroy
| reality" (benevolent world-exploder) button, I'd smash it!
| keybored wrote:
| Interesting to see this perspective here. You're not
| wrong.
|
| > There's a reason that the Jain population is in the low
| millions
|
| The two largest Vedic religions both have hundreds of
| millions of followers. Is Jainism _that_ different from
| them in this regard? I know Jainism is very pacifist, but
| I mean on the question of _suffering_.
| bowsamic wrote:
| I felt this way for the first 30 years of my life. Then I
| received treatment for depression (psychoanalysis) and
| finally tasted joy for the first time in my entire life.
| Now I love life. YMMV
|
| EDIT: If you're interested, what actually happened is that
| I was missing the prerequisite early childhood experience
| that enables one to feel secure in reality. If you check,
| all the people who have this feeling of
| philosophical/ontological pessimism have a missing or
| damaged relationship with the mother in the first year or
| so. For them, not even Buddhism can help, since even the
| abstract idea of anything good, even if it requires
| transcendence, is a joke
| doublerabbit wrote:
| One does not fully experience life until you encounter the
| death of something you care about, be it a pet or a person;
| nothing gives you that real sense of reality until your true
| feelings are challenged.
|
| I used to live in the Disney headspace until my dog had to be
| put down. Now, with my parents being in their seventies and
| me in my thirties, I fear losing them the most, as the feeling
| of losing my dog was hard enough.
| batch12 wrote:
| That's the tragic consequence of being human. Either the
| people you care about leave first or you do, but in the
| end, everyone goes. We are blessed and cursed with the
| knowledge to understand this. We should try to maximize the
| time we spend with those that are important to us.
| wholinator2 wrote:
| Well, I think it goes to a point. I'd imagine there's some
| goldilocks zone of time spent with the animal, care
| experienced from the animal, dependence on the animal, and
| manner/speed of death/ time spent watching the thing die.
|
| I say animal to explicitly include humans. Finding my
| hamster dead in fifth grade did change me. But watching my
| mother slowly die a horrible, haunting death didn't make me
| a better person. I'm just saying that there's a spectrum
| that goes something like: easy to forget about, I'm able to
| not worry, sometimes I think about it when I don't want to,
| often I think about it, often it bothers me, and so on. You
| can probably imagine the cycle of obsession and stress.
|
| This really goes for all traumatic experiences. There's a
| point where they can make you a better person, but there's
| a cliff after which you have no guarantees that it won't
| just start obliterating you and your life. It's still a
| kind of perspective. But can you have too much perspective?
| Lots of times I feel like I do.
| FireBeyond wrote:
| Speaking as a paramedic, two things come to mind:
|
| 1) I'm not squeamish about trauma. In the end, we
| are all blood and tissue. The calls that get to me are the
| emotionally traumatic, the child abuse, domestic violence,
| elder abuse (which of course often have a physical component
| too, but it's the emotional for me), the tragic, often
| preventable accidents.
|
| 2) There are many people - and I get the curiosity - who will
| ask "what's the worst call you've been on?" One, you don't
| really want to hear it, and two, "Hey, person I may barely know,
| do you think you can revisit something traumatic for my
| benefit/curiosity?"
| BobaFloutist wrote:
| It's also super easy to come up with better questions:
| "What's the funniest call you've ever been on?" "What call
| do you feel like you made the biggest difference?" "What's
| the best story you have?"
| Modified3019 wrote:
| That's an excellent way to put it; it resonates with my
| (non-medical) experience. It's the emotional stuff that
| will try to follow me around and be intrusive.
|
| I won't watch most movies or TV because they are just some
| sort of tragedy porn.
| coliveira wrote:
| > movies or TV because they are just some sort of tragedy
| porn
|
| 100% agree. Most TV series nowadays are basically
| violence porn, now that real porn is not allowed for all
| kinds of reasons.
| ocschwar wrote:
| I'd be asking "how bad is the fentanyl situation in your
| are?"
| FireBeyond wrote:
| Relatively speaking, not particularly.
|
| What's interesting now is how many patients will say
| "You're not going to give me fentanyl are you? That's
| really dangerous stuff", etc.
|
| It's their perfect right, of course, but it's sad that
| that's the public perception - it's extremely effective, and
| quite safe, used properly (for one, we're obviously only
| giving it from pharma sources, with actually properly
| dosed solutions for IV).
| nradov wrote:
| I don't mean to trivialize traumatic experiences but I think
| many modern people, especially the pampered members of the
| professional-managerial class, have become too disconnected
| from reality. Anyone who has hunted or butchered animals is
| well aware of the fragility of life. This doesn't damage our
| concept of normal life.
| Eisenstein wrote:
| What is it about partaking in or witnessing the killing of
| animals or humans that makes one more connected to reality?
| AnarchismIsCool wrote:
| Lots of people who spend time working with livestock on a
| farm describe a certain acceptance and understanding of
| death that most modern people have lost.
| Eisenstein wrote:
| Are farmers more willing to discuss things like end of
| life medical decisions?
|
| Are they more amenable to terminally ill people having
| access to euthanasia?
|
| Do they cope better after losing loved ones?
|
| Are there other ways we can get a sense of how a more
| healthy acceptance of mortality would manifest?
|
| Would be interested in this data if it is available.
| s1artibartfast wrote:
| I don't have any data, but my anecdotal experience is a
| yes to those questions.
|
| >Are there other ways we can get a sense of how a more
| healthy acceptance of mortality would manifest?
|
| In concept, yes, I think a death at home with family can
| also have a similar impact. It is not very common in the
| US now, but 50 years ago, elders would typically die at
| home with
| family. There are cultures today, even materially
| advanced ones, where people spend time with the freshly
| dead body of loved ones instead of running from it and
| compartmentalizing it.
| Dalewyn wrote:
| In Japan, some sushi bars keep live fish in tanks that
| you can order to have served to you as sushi/sashimi.
|
| The chefs butcher and serve the fish right in front of
| you, and because it was alive merely seconds ago the meat
| will still be twitching when you get it. If they also
| serve the rest of the fish as decoration, the fish might
| still be gasping for oxygen.
|
| The Japanese don't really think much of it; they're used to
| it and acknowledge the fleeting nature of life and that
| eating something means you are taking another life to
| sustain your own.
|
| The same environment will likely leave most westerners
| squeamish or perhaps even gag simply because the west
| goes out of its way to hide where food comes from, even
| though that simply is the reality we all live in.
|
| Personally, I enjoy meat while respecting and appreciating
| the fact that the steak or sashimi or whatever in front of
| me was a live animal at one point, just like me. Salads
| too; those vegetables were (are?) just as alive as I am.
| Eisenstein wrote:
| If I were to cook a pork chop in the kitchen of some of
| my middle eastern relatives they would feel sick and
| would probably throw out the pan I cooked it with (and me
| from their house as well).
|
| Isn't this similar to why people unfamiliar with that
| style of seafood would feel sick -- cultural views on
| what is and is not normal food -- and not because of
| their view of mortality?
| Dalewyn wrote:
| You're not grasping the point, for which I don't
| necessarily blame you.
|
| Imagine that to cook that pork chop, the chef starts by
| butchering a live pig. Also imagine that he does that in
| view of everyone in the restaurant rather than in the
| "backyard" kitchen let alone a separate butchering
| facility hundreds of miles away.
|
| That's the sushi chef butchering and serving a live fish
| he grabbed from the tank behind him.
|
| When you can actually see where your food is coming from
| and what "food" truly even is, that gives you a better
| grasp on reality and life.
|
| It's also the true meaning behind the often used joke
| that goes: "You don't want to see how sausages are made."
| Eisenstein wrote:
| I grasp the point just fine, but you haven't convinced me
| that it is correct.
|
| The issue most people would have with seeing the sausage
| being made isn't necessarily watching the slaughtering
| process but with seeing pieces of the animal used for
| food that they would not want to eat.
| Dalewyn wrote:
| But isn't that the point? If someone is fine eating
| something so long as he is ignorant or naive, doesn't
| that point to a detachment from reality?
| Eisenstein wrote:
| I wouldn't want to eat a cockroach regardless of whether
| I saw it being prepared or not. The point I am making is
| that 'feeling sick' and not wanting to eat something
| isn't about being disconnected from the food. Few people
| would care if you cut off a piece of steak from a hanging
| slab and grilled it in front of them, but would find it
| gross to pick up all the little pieces of gristle and
| organ meat that fell onto the floor, grind it all up,
| shove it into an intestine, and cook it.
| ImPostingOnHN wrote:
| _> Few people would care if you cut off a piece of steak
| from a hanging slab_
|
| The analogy here would be watching a live cow get
| slaughtered and then butchered from scratch in front of
| you, which I think most Western audiences (more than a
| few) might not like.
| abduhl wrote:
| Most audiences wouldn't like freshly butchered cow -
| freshly butchered meat is tough and not very flavorful,
| it needs to be aged to allow it to tenderize and develop.
| ImPostingOnHN wrote:
| The point is that most Western audiences would likely
| find it unpleasant to be there for the slaughtering and
| butchering from scratch.
| Dalewyn wrote:
| That the point is being repeated to no effect ironically
| illustrates how most modern people (westerners?) are
| detached from reality with regards to food.
| Eisenstein wrote:
| To me, the logical conclusion is that they don't agree
| with your example and think that you are making
| connections that aren't evidenced from it.
|
| I think you are doing the same exact thing with the above
| statement as well.
| Dalewyn wrote:
| In the modern era, most of the things the commons come
| across have been "sanitized"; we do a really good job of
| hiding all the unpleasant things. Of course, this means
| modern-day commons have a fairly skewed "sanitized"
| impression of reality and will get shocked awake if or
| when they see what is usually hidden (e.g.: butchering of
| food animals).
|
| That you insist on contriving one zany situation after
| another instead of just admitting that people today are
| detached from reality illustrates my point rather
| ironically.
|
| Whether it's butchering animals or mining rare earths or
| whatever else, there's a _lot_ of disturbing facets to
| reality that most people are blissfully unaware of.
| Ignorance is bliss.
| abduhl wrote:
| To be blunt, the way you express yourself on this topic
| comes off as very "enlightened intellectual." It's clear
| that you think that your views/assumptions are the
| correct view and any other view is one held by the
| "commons"; one which you can change simply by providing
| the poor stupid commons with your enlightened knowledge.
|
| Recall that this whole thread started with your
| proposition that seeing live fish prepared in front of
| someone "will likely leave most westerners squeamish or
| perhaps even gag simply because the west goes out of its
| way to hide where food comes from, even though that
| simply is the reality we all live in." You had no basis
| for this as far as I can tell, it's just a random musing
| by you. A number of folks responded disagreeing with you,
| but you dismissed their anecdotal comments as being wrong
| because it doesn't comport with your view of the unwashed
| masses who are, obviously, feeble minded sheep who
| couldn't possibly cope with the realities of modern food
| production in an enlightened way like you have whereby
| you "enjoy meats respecting and appreciating the fact
| that the steak or sashimi or whatever in front of me was
| a live animal at one point just like me." How noble of
| you. Nobody (and I mean this in the figurative sense not
| the literal sense) is confused that the slab of meat in
| front of them was at one point alive.
|
| Then you have the audacity to accuse someone of coming up
| with "zany" situations? You're the one that started the
| whole zany discussion in the first place with your own
| zany musings about how "western" "commons" think!
| Eisenstein wrote:
| A cow walks into the kitchen, it gets a captive bolt
| shoved into its brain with a person holding a compressed
| air tank. Its hide is ripped off and it is cut into two
| pieces with all of its guts on the ground and the flesh
| and bones now hang as slabs.
|
| I am asserting that you could do all of that in front of
| a random assortment of modern Americans, and then cut
| steaks off of it and grill them and serve them to half of
| the crowd, and most of those people would not have a
| problem eating those steaks.
|
| Then if you were to scoop up all the leftover, non-steak
| bits from the ground with shovels, throw it all into a
| giant meat grinder and then take the intestines from a
| pig, remove the feces from them and fill them with the
| output of the grinder, cook that and serve it to the
| other half of the crowd, then a statistically larger
| proportion of that crowd would not want to eat that
| compared to the ones who ate the steak.
| kenjackson wrote:
| You're just going down the list of things that sound
| disgusting. The second sounds worse than the first but
| both sound horrible.
| ImPostingOnHN wrote:
| _> I am asserting that you could do all of that in front
| of a random assortment of modern Americans, and then cut
| steaks off of it and grill them and serve them to half of
| the crowd, and most of those people would not have an
| problem eating those steaks._
|
| I am asserting that the majority of western audiences,
| including Americans, would dislike being present for the
| slaughtering and butchering portion of the experience you
| describe.
| sensanaty wrote:
| I grew up with my farmer grandpa who was a butcher, and
| I've seen him butcher lots of animals. I always have and
| probably always will find tongues & brains disgusting,
| even though I'm used to seeing how the sausage is made
| (literally).
|
| Some things just tickle the brain in a bad way. I've
| killed plenty of fish myself, but I still wouldn't want
| to eat one that's still moving in my mouth, not because
| of ickiness or whatever, but just because the concept is
| unappealing. I don't think this is anywhere near as
| binary as you make it seem, really.
| caymanjim wrote:
| Plenty of westerners are not as sheltered from their food
| as you. Have you never gone fishing and watched your
| catch die? Have you never boiled a live crab or lobster?
| You've clearly never gone hunting.
|
| Not to mention the millions of Americans working in the
| livestock and agriculture business who see up close every
| day how food comes to be.
|
| A significant portion of the American population engages
| directly with their food and the death process. Citing
| one gimmicky example of Asian culture where squirmy
| seafood is part of the show doesn't say anything about
| the culture of entire nations. That is not how the
| majority of Japanese consume seafood. It's just as
| anomalous there. You only know about it because it's
| unusual enough to get reported.
|
| You can pick your lobster out of the tank and eat it at
| American restaurants too. Oysters and clams on the half-
| shell are still alive when we eat them.
| Dalewyn wrote:
| >Plenty of westerners are not as sheltered from their
| food as you. ... You only know about it because it's
| unusual enough to get reported.
|
| In case you missed it, you're talking to a Japanese person.
|
| Some restaurants go a step further by letting the
| customers literally fish for their dinner out of a pool.
| Granted _those_ restaurants are a niche; that's their
| whole selling point to customers looking for something
| different.
|
| Most sushi bars have a tank holding live fish and other
| seafood of the day, though. It's a pretty mundane thing.
| paganel wrote:
| My brother, an Eastern-European part-time farmer and full-
| time lorry driver, just texted me a couple of hours ago (I
| had told him I would call him in the next hour) that he
| might have his hands full of meat by that time, as "we've
| just butchered our pig Ghitza" (those sausages and _piftii_
| aren't going to get made by themselves).
|
| Now, ask a laptop worker to butcher an animal that used to
| have a name and to literally turn its meat into sausages
| and see what said worker's reaction would be.
| sandworm101 wrote:
| >> ridiculous how people think war is like Call of Duty.
|
| It is also ridiculous how people think every soldier's
| experience is like Band of Brothers or Full Metal Jacket. I
| remember an interview with a WWII vet who had been on Omaha
| Beach: "I don't remember anything happening in slow motion
| ... I do remember eating a lot of sand." The reality of war
| is often just not visually interesting enough to put on the
| screen.
| HPsquared wrote:
| I'm no longer interested in getting a motorcycle, for similar
| reasons.
| spacechild1 wrote:
| I spent my civil service as a paramedic assistant in the
| countryside, close to a mountain road that was very popular
| with bikers. I was never interested in motorbikes in the
| first place, but the gruesome accidents I've witnessed
| turned me off for good.
| ocschwar wrote:
| The Venn diagram for EMTs, paramedics, and motorbikes is
| disjoint.
| zmgsabst wrote:
| You're only about 20x as likely to die on a motorcycle as
| in a car.
|
| What can I say? People like to live dangerously.
| alamortsubite wrote:
| Yes, but you're also far less likely to kill other people
| on a motorcycle than in a car (and even less than in an
| SUV or pick-up truck). So some people live much less
| dangerously with respect to the people around them.
| HPsquared wrote:
| I suppose 20x a low number is still pretty low,
| especially given that number includes the squid factor.
| andrepd wrote:
| I'm pretty sure watching videos on /r/watchpeopledie or rekt
| threads on 4chan has been a net positive for me. I'm keenly
| aware of how dangerous cars are, that wars (including
| narcowars) are hell, that I should never stay close to a bus
| or truck as a pedestrian or cyclist, that I should never get
| into a bar fight... And that I'm very very lucky that I was
| not born in the 3rd world.
| gosub100 wrote:
| I get more upset watching people lightly smack and yell at
| each other on public freakout than I do watching people
| die. It's not that I don't care about the dead either, I
| watched wpd and similar sites for years. I didn't enjoy
| watching it, but I liked knowing the reality of what was
| going on in the world, and how each one of us has the
| capacity to commit these atrocities. I'm still doing a
| lousy job at describing why I like to watch it. But I do.
| mdanger007 wrote:
| Street fight videos, where the guy recording is hooting
| and egging people on, are disgusting.
| Shorel wrote:
| > Becoming aware of this is super damaging to our concept of
| normal life.
|
| Not being aware of this is also a cause of traffic accidents.
| People should be more careful driving.
| portaouflop wrote:
| Normal does not exist - it's just the setting on your washing
| machine.
| LeftHandPath wrote:
| Earlier this year, I was at ground zero of the Super Bowl
| parade shooting. I didn't ever dream about it, but I spent the
| following 3-4 days constantly replaying it in my head in my
| waking hours.
|
| Later in the year I moved to Florida, just in time for Helene
| and Milton. I didn't spend much time thinking about either of
| them (aside from during prep and cleanup and volunteering a few
| weeks after). But I had frequent dreams of catastrophic storms
| and floods.
|
| Different stressors affect people (even myself) differently.
| Thankfully I've never had a major/long-term problem, but I know
| my reactions to major life stressors never seemed to have any
| rhyme or reason.
|
| I can imagine many people might've been through a few things
| that made them confident they'd be alright with the job, only
| to find out dealing with that stuff 8 hours a day, 40 hours a
| week is a whole different ball game.
| sandworm101 wrote:
| A parade shooting is bad, very bad, but is still tame
| compared to the sorts of things to which website moderators
| are exposed on a daily/hourly basis. Footage of people being
| shot is actually allowed on many platforms. Just think of all
| the war footage that is so common these days. The dark stuff
| that moderators see is way way worse.
| wkat4242 wrote:
| > Footage of people being shot is actually allowed on many
| platforms.
|
| It's also part of almost every American cop and military
| show and movie. Of course it's not real but it looks the
| same.
| guerrilla wrote:
| > Of course it's not real but it looks the same.
|
| I beg to differ. TV shows and movies are silly. Action
| movies are just tough-guy dancing.
| jnwatson wrote:
| "Tough guy dancing" is such an apt phrase.
|
| The organizer is even called a "fight choreographer".
| wkat4242 wrote:
| I mean more the gory parts. Blood, decomposed bodies
| everywhere etc.
|
| And I wasn't talking about action hero movies.
| consumer451 wrote:
| I have often wondered what would happen if social product orgs
| required all dev and product team members to temporarily rotate
| through moderation a couple times a year.
| Teever wrote:
| Yeah I've wondered the same thing about jobs in general too.
|
| Society would be a very different place if everyone had to do
| customer service or janitorial work one weekend a month.
| ANewFormation wrote:
| Many (all?) Japanese schools don't have janitors. Instead
| students clean on rotation. Never been much into Japanese
| stuff but I absolutely admire this about their culture, and
| imagine it's part of the reason that Japan is such a clean
| and at least superficially respectful society.
|
| Living in other Asian nations where there are often de
| facto invisible caste systems can be nauseating at times -
| you have parents who won't allow their children to
| participate in clean-up efforts because their child is
| 'above handling trash.' That's gonna be one well-adjusted
| adult...
| alex-korr wrote:
| I can tell you that back when I worked as a dev for the
| department building order fulfillment software at a dotcom,
| my perspective on my own product changed drastically
| after I spent a month at a warehouse that was shipping
| orders coming out of the software we wrote. Eating my own dog
| food was not pretty.
| neilv wrote:
| If I were a tech billionaire, and there was so much
| uploading of stuff so bad that it was giving my
| employees/contractors PTSD, I think I'd find a way to stop
| the perpetrators.
|
| (I'm not saying that I'd assemble a high-speed yacht full of
| commandos, who travel around the world, righting wrongs when no
| one else can. Though that would be more compelling content than
| most streaming video episodes right now. So you could offset the
| operational costs a bit.)
| DiggyJohnson wrote:
| How else would you stop the perpetrators?
| abdullahkhalids wrote:
| Large scale and super sick perpetrators exist (as compared to
| small scale ones who do mildly sick stuff) because Facebook
| is a global network and there is a benefit to operating on
| such a large platform. The sicker you are, while getting away
| with it, the more reward you get.
|
| Switch to a federated social system like Mastodon, with only
| a few thousand or ten thousand users per instance, and
| perpetrators will never be able to grow too large. Easy for
| the moderators to shut stuff down very quickly.
| shadowgovt wrote:
| Tricky. It also gives perpetrators a lot more places to
| hide. I think the jury is out on whether a few centralized
| networks or a fediverse makes it harder for attackers to
| reach potential targets (or customers).
| abdullahkhalids wrote:
| The purpose of Facebook moderators (besides legal
| compliance) is to protect normal people from the "sick"
| people. In a federated network, of course, such people
| will create their own instances, and hide there. But then
| no one is harmed by them, because all such instances
| will be banned quite quickly, same as all spam email
| hosts are blocked very quickly by everyone else.
|
| From a normal person perspective on not seeing bad stuff,
| the design of a federated network is inherently better
| than a global network.
| shadowgovt wrote:
| That's the theory. I'm not sure yet that it works in
| practice; I've seen a lot of people on Mastodon
| complaining about how, as a moderator, keeping up with the
| bad servers is a perpetual game of whack-a-mole because
| access is on by default. Maybe this is a Mastodon-specific
| issue.
| abdullahkhalids wrote:
| That's because Mastodon or any other federated social
| network hasn't taken off, and so not enough development
| has gone into them. If they take off, naturally people
| will develop analogs of spam lists and SpamAssassin etc
| for such systems, which will cut down moderation time
| significantly. I run an org email server, and don't
| exactly do anything besides installing such automated
| tools.
|
| On Mastodon, admins will just have to do the additional
| work to make sure new accounts are not posting weird
| stuff.
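|
| A minimal sketch of that shared-blocklist idea (Python
| here; the names and list contents are hypothetical, in
| the spirit of email RBLs):
|
|     BLOCKED_INSTANCES = {"spam.example", "abuse.example"}
|
|     def accept_post(author_handle: str) -> bool:
|         # Federated handles look like user@instance; drop
|         # anything from an instance on the shared blocklist.
|         instance = author_handle.rsplit("@", 1)[-1]
|         return instance not in BLOCKED_INSTANCES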
| mu53 wrote:
| Big tech vastly underspends on this area. You can find a
| stream of articles from the last 10 years where BigTech
| companies were allowing open child prostitution, paid-for
| violence, and other stuff on their platforms with little
| to no moderation.
| throwie21873 wrote:
| > Switch to a federated social system like Mastodon, with
| only a few thousand or ten thousand users per instance, and
| perpetrators will never be able to grow too large.
|
| The #2 and #3 most popular Mastodon instances allow CSAM.
| thrance wrote:
| If you were a tech billionaire you'd be a sociopath like the
| others and wouldn't give a single f about this. You'd be going
| on podcasts to tell the world that markets will fix everything
| if given the chance.
| richrichie wrote:
| They are not wrong. Do you know of any mechanism other than
| markets that works at scale, doesn't cost a bomb, and
| doesn't involve an abusive central authority?
| thrance wrote:
| Tech billionaires usually advocate for some kind of return
| to the Gilded Age, with minimal workers' rights and
| corporate tax. Markets were freer back then; how did that
| work out for the average man? Markets alone don't do
| anything for the average quality of life.
| richrichie wrote:
| Quality of life for the average man now is way better
| than it was at any time in history. A fact.
| thrance wrote:
| But is it solely because of markets? Would deregulation
| improve our lives further? I don't think so, and that is
| what I am talking about. Musk, Bezos, Andreessen and cie.
| are advocating for a particular _laissez-faire_ flavor of
| capitalism, which historically has been very bad for the
| average man.
| para_parolu wrote:
| One of the few fields where AI is very welcome.
| itake wrote:
| Maybe... Apple had a lot of backlash using AI to detect CSAM.
| Cyph0n wrote:
| Wasn't the backlash due to the fact that they were running
| detection on device against your private library?
| threeseed wrote:
| Yes. As opposed to running it on their servers like they do
| now.
|
| And it was only for iCloud synced photos.
| Zak wrote:
| There's a huge gap between "we will scan our servers for
| illegal content" and "your device will scan your photos
| for illegal content" no matter the context. The latter
| makes the user's device disloyal to its owner.
| aaomidi wrote:
| And introduces avenues for state actors to force the
| scanning of other material.
|
| This was also during a time when Apple hadn't pushed out
| e2ee for iCloud, so it didn't even make sense.
| shadowgovt wrote:
| This ship has pretty much sailed.
|
| If you are storing your data in a large commercial
| vendor, assume a state actor is scanning it.
| kotaKat wrote:
| I'm shocked at the number of people I've seen on my local
| news getting arrested lately for it, and it all comes from
| the same starting tip:
|
| "$service_provider sent a tip to NCMEC" or "uploaded a
| known-to-NCMEC hash", ranging from GMail, Google Drive,
| iCloud, and a few others.
|
| https://www.missingkids.org/cybertiplinedata
|
| "In 2023, ESPs submitted 54.8 million images to the
| CyberTipline of which 22.4 million (41%) were unique. Of
| the 49.5 million videos reported by ESPs, 11.2 million
| (23%) were unique."
| shadowgovt wrote:
| And, indeed, this is why we should not expect the process
| to stop. Nobody is rallying behind the rights of child
| abusers and those who traffic in child abuse material.
| Arguably, nor should they. The slippery slope argument
| only applies if the slope is slippery.
|
| This is analogous to the police's use of genealogy and
| DNA data to narrow searches for murderers, who they then
| collected evidence on by other means. There is risk
| there, but (at least in the US) you aren't going to find
| a lot of supporters of the anonymity of serial killers
| and child abusers.
|
| There are counter-arguments to be made. Germany is
| skittish about mass data collection and analysis because
| of their perception that it enabled the Nazi war machine
| to micro-target their victims. The US has no such
| cultural narrative.
| tzs wrote:
| > And, indeed, this is why we should not expect the
| process to stop. Nobody is rallying behind the rights of
| child abusers and those who traffic in child abuse
| material. Arguably, nor should they.
|
| I wouldn't be so sure.
|
| When Apple was going to introduce on-device scanning they
| actually proposed to do it in _two_ places.
|
| * When you uploaded images to your iCloud account they
| proposed scanning them on your device first. This is the
| one that got by far the most attention.
|
| * The second was to scan incoming messages on phones that
| had parental controls set up. The way that would have
| worked is:
|
| 1. if it detects sexual images it would block the
| message, alert the child that the message contains
| material that the parents think might be harmful, and ask
| the child if they still want to see it. If the child says
| no that is the end of the matter.
|
| 2. if the child says they do want to see it _and_ the
| child is at least 13 years old, the message is unblocked
| and that is the end of the matter.
|
| 3. if the child says they do want to see it _and_ the
| child is under 13 they are again reminded that their
| parents are concerned about the message, again asked if
| they want to view it, and told that if they view it their
| parents will be told. If the child says no that is the
| end of the matter.
|
| 4. If the child says yes the message is unblocked and the
| parents are notified.
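|
| A sketch of that decision flow (Python; this just restates
| the four steps above, and the names are mine, not Apple's):
|
|     def handle_flagged_message(child_age: int,
|                                wants_to_view: bool,
|                                confirms_after_warning: bool):
|         # Step 1: the message is blocked and the child is asked.
|         if not wants_to_view:
|             return "blocked"
|         # Step 2: age 13 or older: unblock, nobody is notified.
|         if child_age >= 13:
|             return "unblocked"
|         # Step 3: under 13: warned again that parents will know.
|         if not confirms_after_warning:
|             return "blocked"
|         # Step 4: unblocked, and the parents are notified.
|         return "unblocked, parents notified"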
|
| This second one didn't get a lot of attention, probably
| because there isn't really much to object to. But I did
| see one objection from a fairly well known internet
| rights group. They objected to #4 on the grounds that the
| person sending the sex pictures to your under-13 year old
| child sent the message to the child, so it violates the
| sender's privacy for the parents to be notified.
| FabHK wrote:
| The choice was between "we will upload your pictures
| unencrypted and do with them as we like, including scan
| them for CSAM" vs. "we will upload your pictures
| encrypted and keep them encrypted, but will make sure
| beforehand on your device only that there's no known CSAM
| among it".
| ImPostingOnHN wrote:
| _> we will upload your pictures unencrypted and do with
| them as we like_
|
| Curious, I did not realize Apple sent themselves a copy
| of all my data, even if I have no cloud account and don't
| share or upload anything. Is that true?
| itake wrote:
| Apple doesn't do this. But other service providers do
| (Dropbox, Google, etc).
|
| Other service providers can scan for CSAM from the cloud,
| but Apple cannot. So Apple might be one of the largest
| CSAM hosts in the world, due to this 'feature'.
| itake wrote:
| Apple is already categorizing content on your device.
| Maybe they don't report what categories you have. But I
| know if I search for "cat" it will show me pictures of
| cats on my phone.
| llm_trw wrote:
| Apple had a lot of backlash for using AI to scan every photo
| you ever took and sending it back to the mothership for more
| training.
| sneak wrote:
| No, they had backlash against using AI on devices they don't
| own to report said devices to police for having illegal files
| on them. There was no technical measure to ensure that the
| devices being searched were only being searched for CSAM, as
| the system can be used to search for any type of images
| chosen by Apple or the state. (Also, with the advent of
| GenAI, CSAM has been redefined to include generated imagery
| that does not contain any of {children, sex, abuse}.)
|
| That's a very very different issue.
|
| I support big tech using AI models running on their own
| servers to detect CSAM on their own servers.
|
| I do not support big tech searching devices they do not own
| in violation of the wishes of the owners of those devices,
| simply because the police would prefer it that way.
|
| It is especially telling that iCloud Photos is not end to end
| encrypted (and uploads plaintext file content hashes even
| when optional e2ee is enabled) so Apple can and does scan
| 99.99%+ of the photos on everyone's iPhones serverside
| already.
| threeseed wrote:
| > they don't own to report said devices to police for
| having illegal files on them
|
| They do this today. https://www.apple.com/child-safety/pdf/Expanded_Protections_...
|
| Every photo provider is required to report CSAM violations.
| samatman wrote:
| Actually they do not.
|
| https://forums.appleinsider.com/discussion/238553/apple-sued...
| skissane wrote:
| > Also, with the advent of GenAI, CSAM has been redefined
| to include generated imagery that does not contain any of
| {children, sex, abuse}
|
| It hasn't been redefined. The legal definition of it in the
| UK, Canada, Australia, New Zealand has included computer
| generated imagery since at least the 1990s. The US Congress
| did the same thing in 1996, but the US Supreme Court ruled
| in the 2002 case of _Ashcroft v Free Speech Coalition_ that
| it violated the First Amendment. [0] This predates GenAI
| because even in the 1990s people saw where CGI was going
| and could foresee this kind of thing would one day be
| possible.
|
| Added to that: a lot of people misunderstand what that 2002
| case held. SCOTUS case law establishes two distinct
| exceptions to the First Amendment - child pornography and
| obscenity. The first is easier to prosecute and more
| commonly prosecuted; the 2002 case held that "virtual child
| pornography" (made without the use of any actual children)
| does not fall into the scope of the child pornography
| exception - but it still falls into the scope of the
| obscenity exception. There is in fact a distinct federal
| crime for obscenity involving children as opposed to
| adults, 18 USC 1466A ("Obscene visual representations of
| the sexual abuse of children") [1] enacted in 2003 in
| response to this decision. Child obscenity is less commonly
| prosecuted, but in 2021 a Texas man was sentenced to 40
| years in prison over it [2] - that wasn't for GenAI, that
| was for drawings and text, but if drawings fall into the
| legal category, obviously GenAI images will too. So
| actually it turns out that even in the US, GenAI materials
| can legally count as CSAM, if we define CSAM to include
| both child pornography and child obscenity - and this has
| been true since at least 2003, long before the GenAI era.
|
| [0] https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalit...
|
| [1] https://www.law.cornell.edu/uscode/text/18/1466A
|
| [2] https://www.justice.gov/opa/pr/texas-man-sentenced-40-years-...
| blackeyeblitzar wrote:
| Thanks for the information. However I am unconvinced that
| SCOTUS got this right. I don't think there should be a
| free speech exception for obscenity. If no other crime
| (like against a real child) is committed in creating the
| content, what makes it different from any other speech?
| skissane wrote:
| > However I am unconvinced that SCOTUS got this right. I
| don't think there should be a free speech exception for
| obscenity
|
| If you look at the question from an originalist
| viewpoint: did the legislators who drafted the First
| Amendment, and voted to propose and ratify it, understand
| it as an exceptionless absolute or as subject to
| reasonable exceptions? I think if you look at the
| writings of those legislators, the debates and speeches
| made in the process of its proposal and ratification,
| etc, it is clear that they saw it as subject to
| reasonable exceptions - and I think it is also clear that
| they saw obscenity as one of those reasonable exceptions,
| even though they no doubt would have disagreed about its
| precise scope. So, from an originalist viewpoint, having
| some kind of obscenity exception seems very
| constitutionally justifiable, although we can still
| debate how to draw it.
|
| In fact, I think from an originalist viewpoint the
| obscenity exception is on firmer ground than the child
| pornography exception, since the former is arguably as
| old as the First Amendment itself, while the latter only
| goes back to the 1982 case of _New York v. Ferber_. In
| fact, the child pornography exception, as a distinct
| exception, only exists because SCOTUS jurisprudence had
| narrowed the obscenity exception to the point that it was
| getting in the way of prosecuting child pornography as
| obscene - and rather than taking that as evidence that
| maybe they 'd narrowed it a bit too far, SCOTUS decided
| to erect a separate exception instead. But, conceivably,
| SCOTUS in 1982 could have decided to draw the obscenity
| exception a bit more broadly, and a distinct child
| pornography exception would never have existed.
|
| If one prefers living constitutionalism, the question is
| - has American society "evolved" to the point that the
| First Amendment's historical obscenity exception ought to
| jettisoned entirely, as opposed to merely be read
| narrowly? Does the contemporary United States have a
| moral consensus that individuals should have the
| constitutional right to produce graphic depictions of
| child sexual abuse, for no purpose other than their own
| sexual arousal, provided that no identifiable children
| are harmed in its production? I take it that is your
| personal moral view, but I doubt the majority of American
| citizens presently agree - which suggests that completely
| removing the obscenity exception, even in the case of
| virtual CSAM material, cannot currently be justified on
| living constitutionalist grounds either.
| dialup_sounds wrote:
| Weird take. The point of on-device scanning is to enable
| E2EE while still mitigating CSAM.
| sneak wrote:
| No, the point of on-device scanning is to enable
| authoritarian government overreach via a backdoor while
| still being able to add "end to end encryption" to a list
| of product features for marketing purposes.
|
| If Apple isn't free to publish e2ee software for mass
| privacy without the government demanding they backdoor it
| for cops on threat of retaliation, then we don't have
| first amendment rights in the USA.
| itake wrote:
| My understanding was the FP risk. The hashes were computed
| on device, but the device would self-report to LEO if it
| detected a match.
|
| People designed images that were FPs of real images. So
| apps like WhatsApp that auto-save images to photo albums
| could cause people a big headache if a contact shared a
| legal FP image.
| PittleyDunkin wrote:
| I don't think the problem there is the AI aspect
| itake wrote:
| My understanding was the FP risk. Everything was on device.
| People designed images that were FPs of real images.
| PittleyDunkin wrote:
| FP? Let us know what this means when you have a chance.
| Federal Prosecution? Fake Porn? Fictional Pictures?
| thinkmassive wrote:
| My guess is False Positive. Weird abbreviation to use
| though.
| pluc wrote:
| Probably because you need to feed it child porn so it can
| detect it...
| hirvi74 wrote:
| Already happened/happening. I have an ex-coworker who left
| my current employer for my state's version of the FBI. Long
| story short, the government has a massive database to
| crosscheck against. Oftentimes, they would use automated
| processes to filter through suspicious data they would
| collect during arrests.
|
| If the automated process flags something as a potential
| hit, the humans would then review those images to verify.
| Every image/video that is discovered to be a hit is also
| inserted into a larger dataset as well. I can't remember
| if the Feds have their own DB (why wouldn't they?), but
| the National Center for Missing and Exploited Children
| runs a database that I believe government agencies use
| too. Not to mention, companies like Dropbox, Google, etc.
| all hash against the database(s) as well.
| hinkley wrote:
| I'm wondering if, like peeking out from behind a blanket at
| horror movies, getting a moderately blurred copy of images
| would reduce the emotional punch of highly inappropriate
| pictures. Or just scaled down tiny.
|
| If it already looks bad blurred or as a thumbnail, don't
| click on the real thing.
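|
| A minimal sketch of that idea (Python with Pillow; the
| size and blur radius are guesses, not tested values):
|
|     from PIL import Image, ImageFilter
|
|     def review_preview(path, size=(128, 128), radius=8):
|         # Downscale first, then blur, so the reviewer sees a
|         # softened preview before opting into the original.
|         img = Image.open(path)
|         img.thumbnail(size)  # downscales in place
|         return img.filter(ImageFilter.GaussianBlur(radius))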
| EasyMark wrote:
| I'd be fine with that as long as it was something I could
| turn off and on at will
| bigfatkitten wrote:
| This is more or less how police do CSAM classification now.
| They start with thumbnails, and that's usually enough to
| determine whether the image is a photograph or an
| illustration, involves penetration, sadism etc without having
| to be confronted with the full image.
| antegamisou wrote:
| You know what is going to end up happening, though: something
| akin to Tesla's "autonomous" Optimus robots.
| Havoc wrote:
| I would have hoped the previously-seen & clearly recognisable
| stuff already gets auto-flagged.
| LeftHandPath wrote:
| I think they use sectioned hashes for that sort of thing.
| They certainly do for e.g. ISIS videos; see
| https://blogs.microsoft.com/on-the-issues/2017/12/04/faceboo...
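|
| The crude version of hash matching is easy to sketch
| (Python; a toy illustration only - real systems use
| perceptual or sectioned hashes that survive re-encoding,
| which plain SHA-256 of file sections does not):
|
|     import hashlib
|
|     KNOWN_HASHES = set()  # hypothetical shared hash list
|
|     def matches_known(path, section_size=1 << 20):
|         # Hash fixed-size sections so a truncated or partial
|         # re-upload can still match an earlier report.
|         with open(path, "rb") as f:
|             while chunk := f.read(section_size):
|                 if hashlib.sha256(chunk).hexdigest() in KNOWN_HASHES:
|                     return True
|         return False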
| 29athrowaway wrote:
| And then the problem is moved to the team curating data sets.
| sunaookami wrote:
| No, this just leads to more censorship without any option to
| appeal.
| krior wrote:
| Curious, do you have a better solution?
| throw__away7391 wrote:
| The solution to most social media problems in general is:
|
| `select * from posts where author_id in @follow_ids
| order by date desc`
|
| At least 90% of the ills of social media are caused by
| using algorithms to prioritize content and determine what
| you're shown. Before these were introduced, you just
| wouldn't see these types of things unless you chose to
| follow someone who chose to post it, and you didn't have
| people deliberately creating so much garbage trying to game
| "engagement".
| mulmen wrote:
| I'd love a chronological feed but if you gave me a choice
| I'd get rid of lists in SQL first.
|
| > select * from posts where author_id in @follow_ids
| order by date desc
|
| SELECT post FROM posts JOIN follows ON posts.author_id =
| follows.author_id WHERE follows.user_id = $session.user_id
| ORDER BY posts.date DESC;
| soulofmischief wrote:
| Not if we retain control and each deploy our own moderation
| individually, relying on trust networks to pre-filter. That
| probably won't be allowed to happen, but in a rational, non-
| authoritarian world, this is something that machine learning
| can help with.
| jsemrau wrote:
| That's a workflow problem.
| slothtrop wrote:
| > without any option to appeal.
|
| Why would that be?
|
| Currently content is flagged and moderators decide whether to
| take it down. Using AI, it's easy to conceive of a process where
| some uploaded content is preflagged requiring an appeal
| (otherwise it's the same as before, a pair of human eyes
| automatically looking at uploaded material).
|
| Uploaders trying to publish rule-breaking content would not
| bother with an appeal that would reject them anyway.
| Eisenstein wrote:
| Because edge cases exist, and it isn't worth it for a
| company to hire enough staff to deal with them. One
| user with a problem, even if that problem is highly
| impactful to their life, just doesn't matter when the user
| is effectively the product and not the customer. Once the
| AI works well enough, the staff is gone and the cases where
| someone's business or reputation gets destroyed because
| there are no ways to appeal a wrong decision by a machine
| get ignored. And of course 'the computer won't let me' or
| 'I didn't make that decision' is a great way for no one to
| ever have to feel responsible for any harms caused by such
| a system.
| sunaookami wrote:
| This, and social media companies in the EU tend to just
| delete stuff because of draconian laws where content must
| be deleted within 24 hours or they face a fine. So companies
| would rather not risk it. Moderators also only have a few
| seconds to decide if something should be deleted or not.
| slothtrop wrote:
| > because there are no ways to appeal
|
| I already addressed this and you're talking over it. Why
| are you making the assumption that AI == no appeal _and_
| zero staff? That makes zero sense; one has nothing to do
| with the other. The human element comes in for the appeal
| process.
| Eisenstein wrote:
| > I already addressed this and you're talking over it.
|
| You didn't address it, you handwaved it.
|
| > Why are you making the assumption that AI == no appeal
| and zero staff?
|
| I explicitly stated the reason -- it is cheaper and it
| will work for the majority of instances while the edge
| cases won't result in losing a large enough user base
| that it would matter to them.
|
| I am not making assumptions. Google notoriously operates
| in this fashion -- for instance unless you are a very
| popular creator, youtube functions like that.
|
| > That makes zero sense, one has nothing to do with the
| other.
|
| "Cheaper, mostly works, and the losses from people leaving
| are not more than the money saved by removing support
| staff" makes perfect sense, and the two things are as
| closely related as identical twins.
|
| > The human element comes in for appeal process.
|
| What does a company have to gain by supplying the staff
| needed to listen to the appeals when the AI does a decent
| enough job 98% of the time? Corporations don't exist to
| do the right thing or to make people happy, they are
| extracting value and giving it to their shareholders. The
| shareholders don't care about anything else, and the way
| I described returns more money to them than yours.
| slothtrop wrote:
| > I am not making assumptions. Google notoriously
| operates in this fashion -- for instance unless you are a
| very popular creator, youtube functions like that.
|
| Their copyright takedown system has been around for many
| years and wasn't contingent on AI. It's a "take-down now,
| ask questions later" policy to please the RIAA and other
| lobby groups. Illegal/abuse material doesn't profit big
| business; their interest is in not having it around.
|
| You deliberately conflated moderation & appeal process
| from the outset. You can have 100% AI handling of suspect
| uploads (for which the volume is much larger) with a
| smaller staff handling appeals (for which the volume is
| smaller), mixed with AI.
|
| Frankly, if your hypothetical upload is _still_ rejected
| after that, it 99% likely violates their terms of use, in
| which case there's nothing to say.
|
| > it is cheaper
|
| A lot of things are "cheaper" in one dimension, AI or not;
| that doesn't mean they'll be employed if customers dislike
| it.
|
| > the money saved by removing support staff makes perfect
| sense and the two things are related to each other like
| identical twins are related to each other.
|
| It does not make sense to have zero staff as part of
| managing an appeal process (precisely to deal with edge
| cases and the fallibility of AI), and it does not make
| sense to have no appeal process.
|
| You're jumping to conclusions. That is the entire point
| of my response.
|
| > What does a company have to gain by supplying the staff
| needed to listen to the appeals when the AI does a decent
| enough job 98% of the time?
|
| AI isn't there yet; notwithstanding, if it did a good job
| 98% of the time, then who cares? No one.
| SoftTalker wrote:
| Nobody has a right to be published.
| sunaookami wrote:
| Then what is freedom of speech if every platform deletes
| your content? Does it even exist? Facebook and co. are so
| ubiquitous, we shouldn't just apply normal laws to them.
| They are bigger than governments.
| henry2023 wrote:
| If that were the case, then Facebook shouldn't be liable
| for moderating any content. Not even CSAM.
|
| Should each government, and in some cases provinces and
| municipalities, have teams to regulate content from their
| region?
| granzymes wrote:
| Freedom of speech means that the government can't punish
| you for your speech. It has absolutely nothing to do with
| your speech being widely shared, listened to, or even
| acknowledged. No one has the right to an audience.
| SoftTalker wrote:
| The government is not obligated to publish your speech.
| They just can't punish you for it (unless you cross a few
| fairly well-defined lines).
| chollida1 wrote:
| > Then what is freedom of speech if every platform
| deletes your content?
|
| Freedom of speech is between you and the government and
| not you and a private company.
|
| As the saying goes, if I don't like your speech I can tell
| you to leave my home; that's not censorship, that's how
| freedom works.
|
| If I don't like your speech, I can tell you to leave my
| property, physical or virtual.
| henry2023 wrote:
| We're talking about Facebook here. You shouldn't have the
| assumption that the platform should be "uncensored" when it
| clearly is not.
|
| Furthermore, I'd rather have the picture of my aunt's
| vacation taken down by an AI mistake than hundreds of
| people getting PTSD because they have to manually review,
| on an hourly basis, whether some decapitation was real or
| illustrated.
| gorbachev wrote:
| Until the AI moderator flags your home videos as child porn,
| and you lose your kids.
| bdangubic wrote:
| I wish they'd get a trillion dollars, but I am sure they signed
| their lives away via waivers and whatnot when they got the job :(
| zuminator wrote:
| Maybe so, but in places with good civil and human rights, you
| can't sign them away via contract, they're inalienable. If
| Kenya doesn't offer these protections, and the allegations are
| correct, then Facebook deserves to be punished regardless for
| profiting off inhumane working conditions.
| bdangubic wrote:
| absolutely!!
| shadowgovt wrote:
| Good! I hope they get every penny owed. It's an awful job, and
| outsourcing it to jurisdictions without protections was naked
| harm maximization.
| sneak wrote:
| Perhaps if looking at pictures of disturbing things on the
| internet gives you PTSD, then this isn't the kind of job for you?
|
| Not everyone can be a forensic investigator or coroner, either.
|
| I know lots of people who can and do look at horrible pictures on
| the internet and have been doing so for 20+ years with no ill
| effects.
| luqtas wrote:
| perhaps life in Kenya isn't as easy as yours?
| wruza wrote:
| It isn't known in advance though. These people went to that job
| and got psychiatric diseases that, considering the
| thirdworldiness, they are unlikely to get rid of.
|
| I'm not talking about obvious "scream and run away" reaction
| here. One may think that it doesn't affect them or people on
| the internet, but then it suddenly does after they binge it all
| day for a year.
|
| The fact that no less than 100% got PTSD should tell us
| something here.
| eesmith wrote:
| The 100+ years of research on PTSD, starting from shell shock
| studies in WWI, shows that PTSD isn't so simple.
|
| Some people come out with no problems, while their trenchmate
| facing almost identical situations suffers for the rest of
| their lives.
|
| In this case, the claim is that "it traumatised 100% of
| hundreds of former moderators tested for PTSD ... In any other
| industry, if we discovered 100% of safety workers were being
| diagnosed with an illness caused by their work, the people
| responsible would be forced to resign and face the legal
| consequences for mass violations of people's rights."
|
| Do those people you know look at horrible pictures on the
| internet for 8-10 hours each day?
| pluc wrote:
| Worked at PornHub's parent company for a bit and the moderation
| floor had a noticeable depressive vibe. Huge turnover. Can't
| imagine what these people were subjected to.
| HenryBemis wrote:
| You don't mention the year(s), but I recently listened to
| Jordan Peterson's podcast episode 503. One Woman's War on
| P*rnhub | Laila Mickelwait.
|
| I will go ahead and assume that during the wild/carefree era of
| PornHub, when anyone could upload anything and everything,
| pedophilia videos, bestiality, etc. were, from what that lady
| said, rampant.
| pluc wrote:
| Yeah, it was during that time, before the great purge. It's
| not just sexual depravity, people used that site to host all
| kinds of videos that would get auto-flagged anywhere else
| (including, the least of it, full movies).
| chimeracoder wrote:
| > You don't mention the year(s), but I recently listened to
| Jordan Peterson's podcast episode 503. One Woman's War on
| P*rnhub | Laila Mickelwait.
|
| Laila Mickelwait is a director at Exodus Cry, formerly known
| as Morality in Media (yes, that's their original name).
| Exodus Cry/Morality in Media is an explicitly Christian
| organization that openly seeks to outlaw all forms of
| pornography, in addition to outlawing abortion and many gay
| rights including marriage. Their funding comes largely from
| right-wing Christian fundamentalist and fundamentalist-
| aligned groups.
|
| Aside from the fact that she has an axe to grind, both she
| (as an individual) and the organization she represents have a
| long history of misrepresenting facts or outright lying in
| order to support their agenda. They also intentionally and
| openly refer to all forms of sex work (from consensual
| pornography to stripping to sexual intercourse) as
| "trafficking", against the wishes of survivors of actual sex
| trafficking, who have extensively documented why Exodus Cry
| actually perpetuates harm against sex trafficking victims.
|
| > everything, from what that lady said, the numbers of
| pedophilia videos, bestiality, etc. was rampant.
|
| This was disproven long ago. Pornhub was actually quite good
| about proactively flagging and blocking CSAM and other
| objectionable content. Ironically (although not surprisingly,
| if you're familiar with the industry), Facebook was two to
| three orders of magnitude worse than Pornhub.
|
| But of course, Facebook is not targeted by Exodus Cry because
| their mission - as you can tell by their original name of
| "Morality in Media" - is to ban pornography on the Internet,
| and going after Facebook doesn't fit into that mission, even
| though Facebook is actually way worse for victims of CSAM and
| trafficking.
| whacko_quacko wrote:
| Sure, but who did the proactive flagging back then?
| Probably moderators. Seems like a shitty job nonetheless.
| bigfatkitten wrote:
| As far as I can tell, Facebook is still terrible.
|
| I have a throwaway Facebook account. In the absence of any
| other information as to my interests, Facebook thinks I
| want to see flat earth conspiracy theories and CSAM.
|
| When I report the CSAM, I usually get a response that says
| "we've taken a look and found that this content doesn't go
| against our Community Standards."
| jkestner wrote:
| Borrowing the thought from Ed Zitron, but when you think about
| it, most of us are exposing ourselves to low-grade trauma when we
| step onto the internet now.
| rnewme wrote:
| That's the risk of being in a society in general, it's just
| that we interact with people outside way less now. If one
| doesn't like it, they can always be a hermit.
| jkestner wrote:
| Not just that, but that algorithms are driving us to the
| extremes. I used to think it was just that humans were not
| meant to have this many social connections, but it's more
| about how these connections are mediated, and by whom.
|
| Worth reading Zitron's essay if you haven't already. It
| sounds obvious, but the simple cataloging of all the
| indignities we take for granted builds up to a bigger
| condemnation than just Big Tech.
| https://www.wheresyoured.at/never-forgive-them/
| kelseyfrog wrote:
| Is there any way to look at this that doesn't resort to black
| or white thinking? That's a rather extreme view in itself
| that could use some nuance and moderation.
| sellmesoap wrote:
| What's more, popular TV shows regularly have scenes that could
| cause trauma; the media has been ramping up the intensity of
| content for years. I think it's simply chasing word of mouth:
| 'did you see GoT last night? Oh my gosh, so-and-so did such-
| and-such to so-and-so!'
| aquariusDue wrote:
| It really became apparent to me when I watched the FX remake
| of Shogun, the 1980 version seems downright silly and
| carefree by comparison.
| blueflow wrote:
| I'm curious about the content that these people moderated. What
| is it about seeing it that fucks people up?
| bdangubic wrote:
| things that you cannot unsee, the absolute worst of humanity
| crystal_revenge wrote:
| From the first paragraph of the article:
|
| > post-traumatic stress disorder caused by exposure to graphic
| social media content including murders, suicides, child sexual
| abuse and terrorism.
|
| If you want a taste of the _legal_ portion of these, just go
| to 4chan.org/gif/catalog and look for a "rekt", "war", "gore",
| or "women hate" thread. Watch every video there for 8-10 hours
| a day.
|
| Now remember this is the _legal_ portion of the content
| moderated as 4chan does a good job these days of removing
| _illegal_ content mentioned in that list above. So all these
| examples will be a _milder_ sample of what moderators deal
| with.
|
| And do remember to browse for 8-10 hours a day.
|
| edit: it should go without saying that the content there is
| deep in the NSFW territory, and if you haven't already stumbled
| upon that content, I do not recommend browsing "out of
| curiosity".
| dyauspitr wrote:
| As someone that grew up with 4chan I got pretty desensitized
| to all of the above very quickly. Only thing I couldn't watch
| was animal abuse videos. That was all years ago though; now
| I'm fully sensitized to all of it again.
| sandspar wrote:
| The point is that you don't know which one will stick. Even
| people who are desensitized will remember certain things, a
| person's facial expression or a certain sound or something
| like that, and you can't predict which one will stick with
| you.
| azinman2 wrote:
| Did your parents know what you were seeing? Advice to
| others to not have kids see this kind of stuff, let alone
| get desensitized to it?
|
| What drew you to 4chan?
| dyauspitr wrote:
| Of course not. What drew me in was the edginess. What
| kept me there was the very dark but funny humor. This was
| in 2006-2010, it was all brand new, it was exciting.
|
| I have a kid now and my plan is to not give her a
| smartphone/social media till she's 16 and heavily monitor
| internet access until she's at least 12. Obviously I can't
| control what she will see with friends but she goes to a
| rigorous school and I'm hoping that will keep her busy.
| Other than that I'm hoping the government comes down hard
| on social media access to kids/teenagers and all the
| restrictions are legally codified by the time she's old
| enough.
| 6yyyyyy wrote:
| That fucking guy torturing monkeys :(
| numpad0 wrote:
| Accounts like yours and this report of PTSD don't line up,
| yet both are credible. What's driving the moderators crazy
| but not Old Internet vets?
|
| Could it be:
|
| - the fact that moderators are hired and paid
| - that kids are young and a lot more tolerant
| - that moderators aren't the intended audiences
| - backgrounds, sensitivity in media at all
| - the amount of disturbing images
| - the amount in total, not just the bad ones
| - anything else?
|
| Personally, I'm suspecting that difference in exposure to
| _any kind of media_ might be a factor; I've come across
| stories online that imply visiting and staying at places
| like Tokyo can almost drive people crazy, from the amount
| of stimuli alone.
|
| Doesn't it sound a bit too shallow and biased to determine
| it was specifically CSAM or whatever specific type of data
| that did it?
| kernal wrote:
| There was a report by 60 minutes (I think) on this fairly
| recently. I'm not surprised the publicity attracted lawyers
| soon after.
| percentcer wrote:
| it's kinda crazy that they have normies doing this job
| istjohn wrote:
| Normies? As opposed to who?
| atleastoptimal wrote:
| An obvious job that it would benefit everyone for AI to do
| instead of humans.
| 1vuio0pswjnm7 wrote:
| Perhaps this is what happens when someone creates a mega-sized
| website comprising hundreds of millions of pages using other
| peoples' submitted material, effectively creating a website that
| is too large to "moderate". By letting the public publish their
| material on someone else's mega-sized website instead of hosting
| their own, perhaps it concentrates the web audience to make it
| more suitable for advertising. Perhaps if the PTSD-causing
| material was published by its authors on the authors' own
| websites, the audience would be small, not suitable for
| advertising. A return to less centralised web publishing would
| perhaps be bad for the so-called "ad ecosystem" created by so-
| called "tech" company intermediaries. To be sure, it would also
| mean no one in Kenya would intentionally be subjected to PTSD-
| causing material in the name of fulfilling the so-called "tech"
| industry's only viable "business model": surveillance, data
| collection and online ad services.
| croissants wrote:
| A return to less centralized web publishing would also be bad
| for the many creators who lack the technical expertise or
| interest to jump through all the hoops required for building
| and hosting your own website. Maybe this seems like a pretty
| small friction to the median HN user, but I don't think it's
| true for creators in general, as evidenced by the enormous
| increase in both the number and sophistication of online
| creators over the past couple of decades.
|
| Is that increase worth traumatizing moderators? I have no idea.
| But I frequently see this sentiment on HN about the old
| internet being better, framed as criticism of big internet
| companies, when it really seems to be at least in part
| criticism of how the median internet user has changed -- and
| the solution, coincidentally, would at least partially reverse
| that change.
| moomin wrote:
| I mean, the technical expertise thing is solvable; it's just
| that no one wants to solve it because SaaS is extremely
| lucrative.
| apitman wrote:
| Content hosting for creators can be commoditized.
|
| Content discovery may even be able to remain centralized.
|
| No idea if there's a way for it to work out economically
| without ads, but ads are also unhealthy so maybe that's ok.
| lalalali wrote:
| Introducing a free unlimited hosting service where you
| could only upload pictures, text or video. There's a public
| page to see that content among ads and links to your
| friends' free hosting service pages. The TOS is a give-give:
| you give them the right to extract all the aggregated stats
| they want and display the ads; they give you the service for
| free so you own your content (and are legally responsible
| for it).
| coryrc wrote:
| It's a problem when you don't verify the identity of your users
| and hold them responsible for illegal things. If Facebook
| verified you were John D SSN 123-45-6789 they could report you
| for uploading CSAM and otherwise permanently block you from
| using the site if uploading objectionable material; meaning
| exposure to horrific things is only necessary once per
| banned user. I would expect that to be orders of magnitude less
| than what they deal with today.
| renewiltord wrote:
| You can thank online privacy activists for this.
| DaSHacka wrote:
| You can thank IRL privacy activists for the lack of cameras
| in every room in each house; just imagine how much faster
| domestic disputes could be resolved!
| renewiltord wrote:
| Sure, there's a cost-benefit to it. We think that privacy
| is more important than rapid resolution of domestic
| disputes and we think that privacy is more important than
| stopping child porn. That's fine as a statement.
| IshKebab wrote:
| Rubbish. The reason Facebook doesn't want to demand ID for
| most users is that it adds friction to using their product,
| which means fewer users and less profit.
| kelseyfrog wrote:
| Unsurprising lack of response to this statement. It's 100%
| true, and any cost-benefit calculation of privacy should
| account for it.
| xvector wrote:
| This is the one job we can probably automate now.
| oefrha wrote:
| They should probably hire more part-time people working one hour
| a day?
|
| Btw, it's probably a different team handling copyright claims,
| but my run-in with Meta's moderation gives me the impression that
| they're probably horrifically understaffed. I was helping a
| Chinese content creator friend take down Instagram, YouTube and
| TikTok accounts re-uploading her content and/or impersonating her
| (she doesn't have any presence on these platforms and doesn't
| intend to). Reported to TikTok twice, got it done once within a
| few hours (I was impressed) and once within three days. Reported
| to YouTube once and it was handled five or six days later. No
| further action was needed from me after submitting the initial
| form in either case. Instagram was something else entirely; they
| used Facebook's reporting system, the reporting form was the
| worst, it asked for very little information upfront but kept
| sending me emails afterwards asking for more information, then
| eventually radio silence. I sent follow-ups asking about
| progress; again, radio silence. The impersonation account with
| outright stolen content is still up to this day.
| wkat4242 wrote:
| I have several friends who do this work for various platforms.
|
| The problem is, someone has to do it. These platforms are
| mandated by law to moderate it or else they're responsible for
| the content the users post. And the companies cannot shield
| their employees from it because the work simply needs doing. I
| don't think we can really blame the platforms (though I think the
| remuneration could be higher for this tough work).
|
| The work tends to suit some people better than others. The same
| way some people will not be able to be a forensic doctor doing
| autopsies. Some have better detachment skills.
|
| All the people I know that do this work have 24/7 psychologists
| on site (most of them can't work remotely due to the private
| content they work with). I do notice though that most of them do
| have an "Achilles heel". They tend to shrug most things off
| without a second thought but there's always one or two specific
| things or topics that haunt them.
|
| Hopefully eventually AI will be good enough to deal with this
| shit. It sucks for their jobs of course, but it's not the kind of
| job anyone really does with pleasure.
| ternnoburn wrote:
| "Someone has to do it" is a strong claim. We could instead not
| have the services that require it.
| EdgeSlash wrote:
| Absolutely. The platforms could reasonably easily stop allowing
| anonymous accounts. They don't because more users means more
| money.
| wkat4242 wrote:
| Uhh no I'm not giving up my privacy because a few people
| want to misbehave. Screw that. My friends know who I am but
| the social media companies shouldn't have to.
|
| Also, it'll make social media even more fake than it
| already is. Everyone trying to be as fake as possible. Just
| like LinkedIn is now. It's sickening, all these people
| toeing the company line. Even though they do nothing but
| complain when you speak to them in person.
|
| And I don't think it'll actually solve the problem. People
| find ways to get through the validation with fake IDs.
| ternnoburn wrote:
| Not what I was saying. I'm questioning the need for the
| thing entirely.
| Der_Einzige wrote:
| So brown/black people in the third world who often find that
| this is their only meaningful form of social mobility are the
| "someone" by default? Because that's the de-facto world we
| have!
| wkat4242 wrote:
| That's not true at all. All the people I speak of are here in
| Spain. They're generally just young people starting a career.
| Many of them end up in the fringes of cybersecurity work
| (user education etc) actually because they've seen so many
| scams. So it's the start of a good career.
|
| Sure, some companies also outsource to Africa, but that
| doesn't mean this work is only available to third-world
| countries. And there aren't that many jobs in it. It's more
| than possible to find enough people who can stomach it.
|
| There was another article a few years back about the poor
| state of mental health of Facebook moderators in Berlin. This
| is not exclusively a poor people problem. More of a wrong
| people for the job problem.
|
| And of course we should look more at why this is the only
| form of social mobility for them if it's really the case.
| azinman2 wrote:
| > The moderators from Kenya and other African countries were
| tasked from 2019 to 2023 with checking posts emanating from
| Africa and in their own languages but were paid eight times less
| than their counterparts in the US, according to the claim
| documents
|
| Why would pay in different countries be equivalent? Pretty sure
| FB doesn't even pay the same to their engineers depending on
| where in the US they are, let alone which country. Cost of living
| dramatically differs.
| guerrilla wrote:
| > Why would pay in different countries be equivalent?
|
| Because it's exploitative otherwise. It's just exploiting the
| fact that they're imprisoned within borders.
| jnwatson wrote:
| It is also exploiting the fact that humans need food and
| shelter to live and money is used to acquire those things.
| guerrilla wrote:
| That's only exploitation if you combine it with the fact of
| the enclosure of the commons: all land and productive
| equipment on Earth is private or state property, and it's
| virtually impossible to just go farm or hunt for yourself
| without being fucked with anymore, let alone do anything
| more advanced without being shut down violently.
| apitman wrote:
| That's actually an interesting question. I would love to
| see some data on whether it really is impossible for the
| average person to live off the land if they wanted to.
|
| An adjacent question is whether there are too many people
| on the planet for that to be an option anymore even if it
| were legal.
| fluoridation wrote:
| >An adjacent question is whether there are too many
| people on the planet for that to be an option anymore
| even if it were legal.
|
| Do you mean for _everyone_ to be hunter-gatherers? Yes,
| that would be impossible. If you mean for a smaller
| number then it depends on the number.
| apitman wrote:
| Yeah I think it would be interesting to know how far over
| the line we are.
| fluoridation wrote:
| Probably way, way over the line. Population sizes
| exploded after the agricultural revolution. I wouldn't be
| surprised if the maximum is like 0.1-1% of the current
| population. If we're talking about strictly eating what's
| available without any cultivation at all, nature is
| really inefficient at providing for us.
| gruez wrote:
| >the enclosure of the commons and that all land and
| productive equipment on Earth is private or state
| property and that it's virtually impossible to just go
| farm or hunt for yourself without being fucked with
| anymore, let alone do anything more advanced without
| being shut down violently.
|
| How would land allocation work without "enclosure of the
| commons"? Does it just become a free-for-all? What
| happens if you want to use the land for grazing but
| someone else wants it for growing crops? "enclosure of
| the commons" conveniently solves all these issues by
| giving exclusive control to one person.
| guerrilla wrote:
| Elinor Ostrom covered this extensively in her Nobel
| Prize-winning work if you are genuinely interested.
| Enclosure of the commons is not the only solution to the
| problems.
| mcntsh wrote:
| Interesting perspective. I wonder if you yourself take part
| in the exploitation by purchasing things made/grown in poor
| countries due to cost.
| numpad0 wrote:
| vegans die of malnutrition.
| throwie21873 wrote:
| There's no ethical consumption under capitalism.
| fluoridation wrote:
| If that's the case then there can also be no ethical
| employment, either, both for employer and for employee.
| So that would seem to average out to neutrality.
| gruez wrote:
| You haven't actually explained why it's bad, only slapped an
| evil sounding label on it. What's "exploitative" in this case
| and why is it morally wrong?
|
| >they're imprisoned within borders
|
| What's the implication of this then? That we remove all
| migration controls?
| guerrilla wrote:
| Of course. Not all at once, but gradually over time like
| the EU has begun to do. If capital and goods are free to
| move, then so must labor be. The labor market is very far
| from free if you think about it.
| MacsHeadroom wrote:
| Paying local market rates is not exploitative.
| guerrilla wrote:
| Artificially creating local market rates by trapping people
| is.
| lvzw wrote:
| In what sense were these local rates "created
| artificially"? Are you suggesting that these people are
| being forced to work against their will?
| meiraleal wrote:
| > Why would pay in different countries be equivalent?
|
| Why 8 times less?
| renewiltord wrote:
| Because that's the only reason why anyone would hire them. If
| you've ever worked with this kind of contract workforce they
| aren't really worth it without massive cost-per-unit-work
| savings. I suppose one could argue it's better that they be
| unemployed than work in this job but they always choose
| otherwise when given the choice.
| apitman wrote:
| Because people chose to take the jobs, so presumably they
| thought it was fair compensation compared to alternatives.
| Unless there's evidence they were coerced in some way?
|
| Note that I'm equating all jobs here. No amount of
| compensation makes it worth seeing horrible things. They are
| separate variables.
| fluoridation wrote:
| _No amount_? So you wouldn't accept a job to moderate
| Facebook for a million dollars a day? If you would, then
| surely you would also do it for a lower number. There is an
| equilibrium point.
| apitman wrote:
| > So you wouldn't accept a job to moderate Facebook for a
| million dollars a day?
|
| I would hope not.
| fluoridation wrote:
| Sorry, but I don't believe you. You could work for a
| month or two and retire. Or hell, just do it for one day
| and then return to your old job. That's a cool one mill
| in the bank.
|
| My point is, job shittiness can be priced in.
| lalalali wrote:
| > work for a month or two and retire --> This is a dream
| of many, but there exists a set of people who really like
| their jobs and have no intention of retiring.
|
| > just do it for one day and then return to your old job.
| --> A cool mill in the bank and dreadful images in your
| head. Perhaps Apitman feels he has enough cash and won't
| be happier with a million (more?).
|
| Also, your point is true but ignores Facebook's lack of
| interest in raising that number. I guess it was more a
| theoretical reflection than an argument about the concrete
| economics.
| tyre wrote:
| GDP per capita in Kenya is a little less than $2k. In the
| United States, it's a bit over $81k.
|
| Median US salary is about $59k. Gross national income (not an
| identical measure, but close) in Kenya is about $2.1k.
|
| 1/8th is disproportionately in favor of the contractors,
| relative to market.
| wyager wrote:
| Because prices are determined by supply and demand
| meiraleal wrote:
| The same is true for poverty and the poor who will work
| for any amount: the cheap labor the rich need to make
| riches.
| abdullahkhalids wrote:
| Some products have factories in multiple countries. For
| example, Teslas are produced in both US and China. The cars
| produced in both countries are more or less identical in
| quality. But do you ever see that the market price of the
| product is different depending on the country of manufacture?
|
| If the moderators in Kenya are providing the same quality labor
| as those from the US, why the difference in price of their
| labor?
|
| I have a friend who worked for FAANG and had to temporarily
| move from US to Canada due to visa issues, while continuing to
| work for the same team. They were paid less in Canada. There is
| no justification for this except that the company has price
| setting power and uses it to exploit the sellers of labor.
| azinman2 wrote:
| A million things factor into market dynamics. I don't know
| why this is such a shocking or foreign concept. Why is a
| waitress in Alabama paid less than in San Francisco for the
| same work? It's a silly question because the answers are both
| obvious and complex.
| pllbnk wrote:
| There have been multiple instances where I would receive invites
| or messages from obvious bots - users having no history, generic
| name, sexualised profile photo. I would always report them to
| Facebook just to receive a reply an hour or a day later that no
| action has been taken. This means there is no human in the
| pipeline and probably only the stuff that's not passing their
| abysmal ML filter goes to the actual people.
|
| I also have a relative whose profile is stuck: they are unable
| to change any contact details, neither email nor password,
| because the FB account center doesn't open for them. Again,
| there is no human support.
|
| Big Tech companies must be mandated by law to have a number of
| live support people, working and reachable, that is a fixed
| fraction of their user count. Then they would have no incentive
| to inflate their user numbers artificially. As for the
| moderators, there should also be a strict upper limit on the
| amount of content (content tokens, if you will) they view
| during their work day. Then the companies would also be more
| willing to limit the amount of content on their systems.
|
| Yeah, it's bad business for them but it's a win for the people.
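|
| As a back-of-the-envelope illustration of the proposal (the
| ratio and the cap below are invented numbers, not anything
| from the article):
|
|   SUPPORT_RATIO = 1 / 200_000  # one live support person per 200k users
|   DAILY_TOKEN_CAP = 50_000     # max "content tokens" per moderator-day
|
|   def required_support_staff(user_count):
|       return max(1, round(user_count * SUPPORT_RATIO))
|
|   def required_moderators(daily_flagged_tokens):
|       # capping per-moderator exposure makes moderation headcount
|       # a direct function of how much content the platform carries
|       return -(-daily_flagged_tokens // DAILY_TOKEN_CAP)  # ceiling
|
|   # e.g. 3 billion users -> 15,000 live support staff at this ratio
|
| Both rules tie headcount to scale, which is exactly the incentive
| described above.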
| yodsanklai wrote:
| I'm wondering if there are precedents in other domains. There are
| other jobs where you do see disturbing things as part of your
| duty. E.g. doctors, cops, first responders, prison guards and so
| on...
|
| What makes moderation different? and how should it be handled so
| that it reduces harm and risks? surely banning social media or
| not moderating content aren't options. AI helps to some extent
| but doesn't solve the issue entirely.
| whaleofatw2022 wrote:
| From those I know that worked in the industry, contractor
| systems are frequently abused to avoid providing the right
| level of counseling/support to moderators.
| sd9 wrote:
| I don't have any experience with this, so take this with a
| pinch of salt.
|
| What seems novel about moderation is the frequency that you
| confront disturbing things. I imagine companies like Meta have
| such good automated moderation that what remains to be viewed
| by a human is practically a firehose of almost certainly
| disturbing shit. And as soon as you're done with one post, the
| next is right there. I doubt moderators spend more than 30
| seconds on the average image, which is an awful lot of stuff to
| see in one day.
|
| A doctor just isn't exposed to that sort of imagery at the same
| rate.
| prng2021 wrote:
| "I would imagine that companies like Meta have such good
| automated moderation that what remains to be viewed by a
| human is practically a firehose of almost certainly
| disturbing shit."
|
| This doesn't make sense to me. Their automated content
| moderation is so good that it's unable to detect "almost
| certainly disturbing shit"? What kind of amazing automation
| only works with subtleties but not certainties?
| sd9 wrote:
| I assumed that, at the margin, Meta would prioritise
| reducing false negatives. In other words, they would prefer
| that as many legitimate posts are published as possible.
|
| So the things that are flagged for human review would be on
| the boundary, but trend more towards disturbing than
| legitimate, on the grounds that the human in the loop is
| there to try and publish as many posts as possible, which
| means sifting through a lot of disturbing stuff that the AI
| is not sure about.
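|
| In code form, the routing I'm picturing (thresholds invented):
|
|   def route(score, lo=0.2, hi=0.98):
|       if score < lo:
|           return "publish"       # model confident it's fine
|       if score > hi:
|           return "auto_remove"   # model confident it's prohibited
|       return "human_review"      # the uncertain middle band
|
| The human queue is exactly that middle band, which skews
| disturbing even though the model couldn't auto-remove it.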
|
| There's also the question of training the models - the
| classifiers may need labelled disturbing data. But possibly
| not these days.
|
| However, yes, I expect the absolute most disturbing shit to
| never be seen by a human.
|
| --
|
| Again, literally no experience, just a guy on the internet
| pressing buttons on a keyboard.
| gruez wrote:
| >In other words, they would prefer that as many
| legitimate posts are published as possible.
|
| They'd prefer that as many posts are published, but they
| probably also don't mind some posts being removed if it
| meant saving a buck. When Canada and Australia
| implemented a "link tax", they were happy to ban all news
| content to avoid paying it.
| sd9 wrote:
| Yes, Meta are economically incentivised to reduce the
| number of human reviews (assuming the cost of improving
| the model is worthwhile).
|
| This probably means fewer human reviewers reviewing a
| firehose, not the same number of human reviewers
| reviewing content at a slower rate.
| itake wrote:
| I'd think the higher density/frequency of disturbing content
| would cause people to be desensitized.
|
| I've never seen blood or gore in my life and find seeing it
| shocking.
|
| But I'd imagine gore is a weekly situation for surgeons.
| sd9 wrote:
| I agree. But that might be comorbid with PTSD. It's
| probably not good for you to be _that_ desensitised to this
| sort of thing.
|
| I also feel like there's something intangible regarding
| intent that makes moderation different from being a doctor.
| It's hard for me to put into words, but doctors see gore
| because they can hopefully do something to help the
| individual involved. Moderators see gore but are powerless
| to help the individual, they can only prevent others from
| seeing the gore.
| diggan wrote:
| It's also the type of gore that matters. Some of the
| worst stuff I've seen wasn't the worst because of the
| visuals, but because of the audio. Hearing people begging
| for their life while being executed surely would feel
| different to even a surgeon who might be used to digging
| around in people's bodies.
| itake wrote:
| There are many common situations where professionals are
| helpless, like people who need to clean up dead bodies
| after an accident.
| sangnoir wrote:
| Imagine if this becomes a specialized, remote job where
| one tele-operates the brain-and-blood-scrubbing robot all
| workday long, accident after accident after accident. I am
| sure they'd get PTSD too; sometimes it's just oil and
| coolant, but there's still a lot of body tissue involved.
| crawfordcomeaux wrote:
| Desensitization is only one stage of it. It's not permanent
| & requires dissociation from reality/humanity on some
| level. But that stuff is likely to come back and haunt one
| in some way. If not, it's likely a symptom of something
| deeper going on.
|
| My guess is that's why it's after bulldozing hundreds of
| Palestinians, instead of 1 or 10s of them, that Israeli
| soldiers report PTSD.
|
| If you haven't watched enough videos of the ongoing
| genocides in the world to realize this, it'll be a
| challenge to have a realistic take on this article.
| 0_____0 wrote:
| I watch surgery videos sometimes, out of fascination. It's
| not gore to me - sure it's flesh and blood but there is a
| person whose life is going to be probably significantly
| better afterwards. They are also not in pain.
|
| I exposed myself to actual gore vids in the aughts and
| teens... That stuff still sticks with me in a bad way.
|
| Context matters a lot.
| itake wrote:
| > They are also not in pain.
|
| My understanding is that during surgery, your body is
| most definitely in pain. Your body still reacts as it
| would to any damage, but anesthetics block the pain
| signals from reaching the brain.
| mewpmewp2 wrote:
| But there is a difference between someone making an
| effort healing someone else vs content with implications
| that something really disturbing happened that makes you
| lose faith in humanity.
| lm28469 wrote:
| Also a doctor is paid $$$$$ and it mostly is a vocational job
|
| Content moderator is a min wage job with bad working hours,
| no psychological support, and you spend your day looking at
| rape, child porn, torture and executions.
| gruez wrote:
| >Also a doctor is paid $$$$$
|
| >Content moderator is a min wage job
|
| So it's purely a monetary dispute?
|
| >bad working hours, no psychological support, and you spend
| your day looking at rape, child porn, torture and
| executions.
|
| Many other jobs have the same issues, though admittedly
| with less frequency, but where do you draw the line?
| diggan wrote:
| > but where do you draw the line?
|
| How about grouping the jobs into two categories: A)
| Causes PTSD and B) Doesn't cause PTSD
|
| If a job has a consistently high percentage of people ending
| up with PTSD, then the company that employs them isn't
| equipping them well enough to handle it.
| gruez wrote:
| >How about grouping the jobs into two categories: A)
| Causes PTSD and B) Doesn't cause PTSD
|
| I fail to see how this addresses my previous questions of
| "it's purely a monetary dispute?" and "where do you draw
| the line?". If a job "Causes PTSD" (whatever that means),
| then what? Are you entitled to hazard pay? Does this work
| out in the end to a higher minimum wage for certain jobs?
| Moreover, we don't have similar classifications for other
| hazards, some of which are arguably worse. For instance,
| dying is probably worse than getting PTSD, but the most
| dangerous jobs have pay that's well below the national
| median wage[1][2]. Should workers in those jobs be able
| to sue for redress as well?
|
| [1] https://www.ishn.com/articles/112748-top-25-most-
| dangerous-j...
|
| [2] https://www.bls.gov/oes/current/oes_nat.htm
| kulahan wrote:
| What could a company provide a police officer with to
| prevent PTSD from witnessing a brutal child abuse case? A
| number of sources I found estimate that, at the top of the
| range, ~30% of police officers may be suffering from it.
|
| [1] https://www.policepac.org/uploads/1/2/3/0/123060500/t
| he_effe...
| portaouflop wrote:
| You can't prevent it but you can help deal with it later.
| bloppe wrote:
| > So it's purely a monetary dispute?
|
| I wouldn't say purely, but substantially, yes. PTSD has
| costs. The article lays some out: therapy, medication, and
| mental, physical, and social health issues. Some of these
| money can directly cover, whereas others can only be kinda
| sorta justified with high enough pay.
|
| I think a sustainable moderation industry would try hard
| to attract the kinds of people who are able to perform
| this job without too many negative impacts, and quickly
| relieve those who try but are not well suited, and pay
| for some therapy.
| ajb wrote:
| Also doctors are very frequently able to do something about
| it. Being powerless is a huge factor in mental illness.
| tossandthrow wrote:
| > I imagine companies like Meta have such good automated
| moderation that what remains to be viewed by a human is
| practically a firehose of almost certainly disturbing shit.
|
| On the contrary, I would expect that it would be the edge
| cases that they were shown - why loop in a content moderator
| if you can be sure it is prohibited on the platform without
| exposing one?
|
| In this light, it might make sense why they sue: they are
| there more as a political org, so that Facebook can say "We
| employ 140 moderators in Kenya alone!" while they do
| indifferent work that Facebook can already filter out.
| Retric wrote:
| Even if 1% of images are disturbing, that's multiple per
| hour, let alone across months.
|
| US workman's comp covers PTSD acquired on the job, and
| these kinds of jobs are rife with it.
| aoanevdus wrote:
| Content moderation also involves reading text, so you'd
| imagine that there's a benefit to having people who can
| label data and provide ground truth in any language you're
| moderating.
|
| Even with images, you can have different policies in
| different places or the cultural context can be relevant
| somehow (eg. some country makes you ban blasphemy).
|
| Also, I have heard of outsourcing to Kenya just to save
| cost. Living is cheaper there so you can hire a desk worker
| for less. Don't know where the insistence you'd only hire
| Kenyans for political reasons comes from.
| pizza wrote:
| > They are there more as a political org so that facebook
| can say: "We employ 140 moderators in Kenya alone!" while
| they do indifferent work that facebook already can filter
| out.
|
| Why assume they're just token diversity hires who don't do
| useful work..?
|
| Have you ever built an automated content moderation system
| before? Let me tell you something about them if not: no
| matter how good your automated moderation tool, it is
| pretty much always trivial for someone with familiarity
| with its inputs and outputs to come up with an input it
| mis-predicts embarrassingly badly. And you know what makes
| the biggest difference.. is humans specifying the labels.
| tossandthrow wrote:
| I don't assume diversity hires, I assume that these
| people work for the Kenyan part of Facebook and that
| Facebook employs an equivalent workforce elsewhere.
|
| I am also not saying that content moderation should catch
| everything.
|
| What I am saying is that the content moderation teams
| should ideally decide on the edge cases as they are hard
| for automated systems.
|
| In turn, that also means these people ought not to be
| exposed to overly hardcore material, as that is easier to
| classify.
|
| Lastly I say that if that is not the case - then they are
| probably not there to carry out a function but to fill a
| political role.
| mrweasel wrote:
| > I imagine companies like Meta have such good automated
| moderation
|
| I imagine that they have a system that is somewhere between
| shitty and non-functional. This is the company that will
| more often than not flag marketplace posts as "Selling
| animal", either completely at random or because the pretty
| obvious "from an animal free home" phrase is used.
|
| If they can't get this basic text parsing correct, how can
| you expect them to correctly flag images with any real sense
| of accuracy?
| athrowaway3z wrote:
| I'm not sure your comparisons are close enough to be considered
| precedents.
|
| My guess is that even standing at the ambulance drive-in of a
| big hospital, you won't see as many horrors in a day as these
| people see in 30 minutes.
| lazide wrote:
| Outside of some specific cities, I can guarantee it. Even a
| busy Emergency Dept on Halloween night had only a small
| handful of bloody patients/trauma cases, and nothing truly
| horrific when I did my EMT rotation.
| s1artibartfast wrote:
| My friends who are paramedics have seen some horrific scenes.
| They have also been shot, stabbed, and suffered lifelong
| injuries.
|
| They are obviously not identical scenarios. They have
| similarities and they also have differences.
| fcmgr wrote:
| A friend's friend is a paramedic and as far as I remember they
| can take the rest of a day off after witnessing death on duty
| and there's an obligatory consultation with a mental healthcare
| specialist. From reading the article, it seems like those
| moderators are seeing horrific things almost constantly
| throughout the day.
| Ray20 wrote:
| Sounds crazy. Just imagine dying because the paramedic
| responsible for your survival just wanted to end his day early.
| _qua wrote:
| I've never heard of a policy like that for physicians and
| doubt it's common for paramedics. I work in an ICU and a
| typical day involves a death or resuscitation. We would run
| out of staff with that policy.
| kranke155 wrote:
| Maybe a different country than yours?
| wongarsu wrote:
| Maybe it's different in the US where ambulances cost money,
| but here in Germany the typical paramedic will see a wide
| variety of cases, with the vast majority of patients
| surviving the encounter. Giving your paramedic a day off
| after witnessing a death wouldn't break the bank. In the
| ICU or emergency room it would be a different story.
| _qua wrote:
| Ambulances cost money everywhere, it's just a matter of
| who is paying. Do we think paramedics in Germany are more
| susceptible to PTSD when patients die than ICU or ER
| staff, or paramedics anywhere?
| wongarsu wrote:
| > Ambulances cost money everywhere
|
| Not in the sense that matters here: the caller doesn't
| pay (unless the call is frivolous), leading to more calls
| that are preemptive, overly cautious or for non-live-
| threatening cases. That behind the scenes people and
| equipment are paid for and a whole structure to do that
| exists isn't really relevant here
|
| > Do we think paramedics in Germany are more susceptible
| to PTSD
|
| No, we think that there are far more paramedics than ICU
| or ER staff, and helping them in small ways is pretty
| easy. For ICU and ER staff you would obviously need other
| measures, like staffing those places with people less
| likely to get PTSD or giving them regular counseling by a
| staff therapist (I don't know how this is actually
| handled, just that the problem is very different than the
| issue of paramedics)
| fcmgr wrote:
| I might have misremembered that, but I remember hearing the
| story. Now that I think about it, I think that policy applied
| only after unsuccessful CPR attempts.
| magicalhippo wrote:
| My friend has repeatedly mentioned his dad became an
| alcoholic due to what he saw as a paramedic. This was back in
| the late 80s, early 90s so not sure they got any mental
| health help.
| insane_dreamer wrote:
| at least in the US, those jobs - doctors, cops, firefighters,
| first responders - are well compensated (not sure about prison
| guards), certainly compared to content moderators who are at
| the bottom of the totem pole in an org like FB
| caymanjim wrote:
| What does compensation have to do with it? Is someone who
| stares at thousands of traumatizing, violent images every day
| going to be less traumatized if they're getting paid more?
| t-3 wrote:
| Yes, they will be much more able to deal with the
| consequences of that trauma than someone who gets a
| pittance to do the same thing. A low-wage peon won't even
| be able to afford therapy if they need it.
| unaindz wrote:
| At least they can pay for therapy and afford to stop
| working or find another job
| BoxFour wrote:
| Shamefully, first responders are not well compensated -
| usually it's ~$20 an hour.
| kevin_thibedeau wrote:
| I've lived places where the cops make $100k+. It all
| depends on location.
| BoxFour wrote:
| Sorry - I'm specifically referring to EMTs and
| Paramedics, who usually make somewhere in the realm of
| $18-25 an hour.
| croes wrote:
| Doctors, cops, first responders, prison guards see different
| horrible things.
|
| Content moderators see all of that.
| amelius wrote:
| Don't forget judges, especially the ones in this case ...
|
| And it used to be priests who had to deal with all the nasty
| confessions.
| Cumpiler69 wrote:
| Judges get loads of compensation and perks.
| siliconc0w wrote:
| ER docs definitely get PTSD. Cops too.
| apitman wrote:
| > surely banning social media or not moderating content aren't
| options
|
| Why not? What good has social media done that can't be
| accomplished in some other way, when weighed against the clear
| downsides?
|
| That's an honest question, I'm probably missing lots.
| yodsanklai wrote:
| Billions of people use them daily (facebook, instagram, X,
| youtube, tiktok...). Surely we could live without them like
| we did not long ago, but there's so much interest at play
| here that I don't see how they could be banned. It's akin to
| shutting down the internet.
| smackay wrote:
| I expect first responders rarely have to deal with the level of
| depravity mentioned in this Wired article from 2014,
| https://www.wired.com/2014/10/content-moderation/
|
| You probably DO NOT want to read it.
|
| There's a very good reason moderators are employed in far-away
| countries, where people are unlikely to have the resources to
| gain redress for the problems they have to deal with as a
| result.
| Spooky23 wrote:
| In many states, pension systems give police and fire service
| sworn members a 20 year retirement option. The military has
| similar arrangements.
|
| Doctors and lawyers can't afford that sort of option, but they
| tend to embrace alcoholism at higher rates and collect ex-
| wives.
|
| Moderation may be worse in some ways. All day, every day, you
| see depravity at scale. You see things that shouldn't be seen.
| Some of it you can stop, some you cannot due to the nature of
| the rules.
|
| I think banning social media isn't an answer, but demanding
| change to the algorithms to reduce engagement with high-risk
| content is key.
| DocTomoe wrote:
| Doctors, cops, first responders, prison guards, soldiers etc
| also just so happen to be the most likely groups of people to
| develop PTSD.
| evertedsphere wrote:
| I think part of it is the disconnection from the things you're
| experiencing. A paramedic or firefighter is there, acting in
| the world, with a chance to do good and some understanding of
| how things can go wrong. A content moderator is getting images
| beamed into their brain that they have no preparation for, of
| situations that they have no connection to or power over.
| rrr_oh_man wrote:
| > A paramedic or firefighter is there, acting in the world,
| with a chance to do good and some understanding of how things
| can go wrong.
|
| That's bullshit. Ever talked to a paramedic or firefighter?
| sangnoir wrote:
| As other sibling comments noted: most other jobs don't have the
| same frequent exposure to disturbing content. The closest are
| perhaps combat medics in an active warzone, but even they
| usually get some respite by being rotated.
| gcr wrote:
| Burnout, PTSD, and high turnover are also hallmarks of suicide
| hotline operators.
|
| The difference? The reputable hotlines care a lot more about
| their employees' mental health, with mandatory breaks, free
| counseling, full healthcare benefits (including provisions for
| preventative mental health care like talk therapy).
|
| Another important difference is that suicide hotlines are
| decoupled from the profit motive. As more and more users sign
| up to use a social network, it gets more profitable and more
| and more load needs to be borne by the human moderation team.
| But suicide and mental health risk is (roughly) constant (or
| slowly increasing with societal trends, not product trends).
|
| There's also less of an incentive to minimize human moderation
| cost. In large companies, some directors view mod teams as a
| cost center that takes away from other ventures. In an
| organization dedicated only to suicide hotline response, a
| large share of the income (typically fundraising or donations)
| goes directly into the service itself.
| hnlmorg wrote:
| Frequency plus lack of post-traumatic support.
|
| A content moderator for Facebook will invariably see more
| depravity and more frequently than a doctor or police officer.
| And likely see far less support provided by their employers to
| emotionally deal with it too.
|
| This results in a circumstance where employees have neither the
| time nor the tools to process it.
| danielheath wrote:
| Trauma isn't just a function of what you've experienced, but
| also of what control you had over the situation and whether you
| got enough sleep.
|
| Being a doctor and helping people through horrific things is
| unlike helplessly watching them happen.
|
| IIRC, PTSD is far more common among people with sleep
| disorders, and it's believed that the lack of good sleep is
| preventing upsetting memories from being processed.
| bookofjoe wrote:
| https://news.ycombinator.com/item?id=42465459
| toomanyrichies wrote:
| One terrible aspect of online content moderation is that, no
| matter how good AI gets and no matter how much of this work we
| can dump in its lap, to a certain extent there will always need
| to be a "human in the loop".
|
| The sociopaths of the world will forever be coming up with new
| and god-awful types of content to post online, content that
| current AI moderators haven't encountered before and therefore
| won't know how to classify. It will be up to humans to label
| that new content in order to train the models to handle it,
| meaning humans will have to view it (and suffer consequences
| such as PTSD). The alternative, where the AI labels these new
| images itself and those AI-generated labels are used to update
| the model, famously leads to "model collapse" [1].
|
| Short of banning social media at a societal level, or abstaining
| from it at an individual level, I don't know that there's any
| good solution to this problem. These poor souls are taking a
| bullet for the rest of us. God help them.
|
| 1. https://en.wikipedia.org/wiki/Model_collapse
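|
| A minimal sketch of why the human stays in the loop (the
| classifier interface and the confidence threshold below are
| hypothetical, purely for illustration): the model's own
| low-confidence guesses are exactly the labels you must not
| train on.
|
|     def route_for_labeling(item, classifier):
|         """Auto-handle patterns the model knows; send novel or
|         uncertain content to a human labeler.
|
|         classifier(item) -> (label, confidence) is a
|         hypothetical interface.
|         """
|         label, confidence = classifier(item)
|         if confidence >= 0.95:
|             return ("auto", label)  # a pattern the model has seen
|         # Low confidence usually means novel content. Feeding the
|         # model's own guess back in as a training label is what
|         # risks model collapse, so a human has to view and label it.
|         return ("human_review", None)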
| efitz wrote:
| I have a lot of questions.
|
| The nature of the job really sucks. This is not unusual; there
| are lots of sucky jobs. So my concern is really whether the
| employees were informed of what they would be exposed to.
|
| Also I'm wondering why they didn't just quit. Of course the
| answer is money, but if they knew what they were getting into (or
| what they were already into), and chose to continue, why should
| they be awarded more money?
|
| Finally, if they can't count on employees in poor countries to
| self-select out when the job becomes life-impacting, maybe they
| should make it a temporary gig, e.g. only allow people to do it
| for short periods of time.
|
| My out-of-the-box idea is: maybe companies that need this
| function could interview with an eye towards selecting
| psychopaths. This is not a joke; why not select people who are
| less likely to be emotionally affected? I'm not sure anyone has
| ever done this before and I also don't know if such people would
| be likely to be inspired by the images, which would make this
| idea a terrible one. My point is to find ways to limit the harm
| the job causes to people, perhaps by changing how people
| interact with it, since the nature of the job itself doesn't
| seem likely to change.
| marcinzm wrote:
| So you're expecting these people to have deep enough knowledge
| of human psychology to know ahead of time that this is likely
| to cause them long-term PTSD, with lasting impact on their
| lives, rather than being something they'll simply get over a
| month after quitting?
| efitz wrote:
| I don't think it takes any special knowledge of human
| psychology to understand that horrific images can cause
| emotional trauma. I think it's a matter of basic due diligence
| that, when considering establishing such a position, one
| should consult the literature and professionals to discover
| what the impact might be and what might be done to minimize it.
| throwaway48476 wrote:
| When people are protected from the horrors of the world, they
| tend to develop luxury beliefs, which lead them to create more
| suffering in the world.
| s1artibartfast wrote:
| I tend to agree with growth through realism, but people often
| have the means and ability to protect themselves from these
| horrors. I'm not sure you can systemically prevent this without
| resorting to big brother shoving propaganda in front of people
| and forcing them to consume it.
| throwaway48476 wrote:
| I don't think it needs to be forced, just don't censor so
| much.
| s1artibartfast wrote:
| Isn't that forcing? Who decides how much censorship people
| can voluntarily opt into?
|
| If given control, I think many/most people would opt into a
| significant amount of censorship.
| kelseyfrog wrote:
| Conversely, those who are subjected to harsh conditions often
| develop a cynical view of humanity, one lacking empathy, which
| also perpetuates the same harsh conditions. It's almost like
| protection and subjection aren't the salient dimensions, but
| rather there is some other perspective that better explains the
| phenomenon.
| kdmtctl wrote:
| Just scrolled a lot to find this. I do believe that moderators
| in a _not so safe_ country have seen a lot in their lives. That
| should make them less vulnerable to this kind of exposure, yet
| it looks like it does not.
| throwaway48476 wrote:
| Seeing too much does cause PTSD. All I'm arguing is that some
| people live in a fantasy world where bad things don't happen,
| so they end up voting for ridiculous things.
| Eumenes wrote:
| What do you call ambulance chasers who go after tech companies
| instead? Because this is that.
| quesomaster9000 wrote:
| The Kenyan moderators' PTSD reveals the fundamental paradox of
| content moderation: we've created an enterprise-grade trauma
| processing system that requires concentrated psychological harm
| to function, and then we act surprised when it causes trauma.
| The knee-jerk reaction of suggesting AI as the solution is,
| IMO, just wishful thinking - it's trying to technologically
| optimize away the inherent contradiction of bureaucratized
| thought control. The human cost isn't a bug that better process
| or technology can fix - it's the inevitable result of trying to
| impose pre-internet regulatory frameworks on post-internet
| human communication, frameworks with which large segments of
| the population may simply be incompatible.
| gadflyinyoureye wrote:
| Any idea what our next steps are? It seems like we either stop
| the experiment of mass communication, try to figure out a less
| damaging knowledge-based filtering mechanism (presently
| executed by humans), or throw open the floodgates to all manner
| of trauma-inducing content and let the viewer beware.
| noch wrote:
| > Any idea what our next steps are? [..] try to figure out a
| less damaging knowledge-based filtering mechanism [..]
|
| It should cost some amount of money to post anything online
| on any social media platform: pay to post a tweet, article,
| image, comment, message, reply.
|
| (Incidentally, crypto social networks have this by default
| simply due to constraints in how blockchains work.)
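|
| As a sketch of the mechanic (the flat fee and the balance
| bookkeeping below are hypothetical, just to make the idea
| concrete):
|
|     POST_FEE_CENTS = 5  # hypothetical flat fee per post
|
|     def submit_post(balance_cents, content):
|         """Charge a small per-post fee before accepting content.
|
|         Spam becomes expensive at volume while ordinary use
|         stays cheap. Returns (new_balance, accepted content).
|         """
|         if balance_cents < POST_FEE_CENTS:
|             raise PermissionError("insufficient balance to post")
|         return balance_cents - POST_FEE_CENTS, content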
| smokel wrote:
| How will this help?
| bdangubic wrote:
| the assumption here is that the people posting vile shit are
| also broke & homeless?
| makeitdouble wrote:
| Reducing the sheer volume is still a critically important step.
|
| You're right that fundamentally there's an imbalance between
| the absolute mass of people producing the garbage, and the few
| moderators dealing with it. But we also don't have an option to
| just cut everyone's internet.
|
| Designing platforms and business models that inherently produce
| less of the nasty stuff could help a lot. But even if/when we
| get there, we'll need automated mechanisms to ask people
| whether they really want to be jerks or absolutely need to send
| their dick pics, and to let moderators deal with crime pics
| without having to look at them for more than two seconds.
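|
| One harm-reduction tool along those lines (a sketch, not any
| platform's actual pipeline): show reviewers a grayscaled,
| heavily blurred preview first, and only reveal detail on an
| explicit opt-in.
|
|     from PIL import Image, ImageFilter
|
|     def soften_for_review(path, radius=12):
|         """Return a grayscale, heavily blurred preview so a
|         moderator can often classify an image without taking in
|         its full detail. (Sketch; radius is an arbitrary pick.)
|         """
|         img = Image.open(path).convert("L")  # drop color
|         return img.filter(ImageFilter.GaussianBlur(radius))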
| omoikane wrote:
| Possibly related, here is an article from 2023-06-29:
|
| https://apnews.com/article/kenya-facebook-content-moderation... -
| Facebook content moderators in Kenya call the work 'torture.'
| Their lawsuit may ripple worldwide
|
| I found this one while looking for salary information on these
| Kenyan moderators. This article mentioned that they are being
| paid $429 per month.
___________________________________________________________________
(page generated 2024-12-22 23:01 UTC)