[HN Gopher] Facebook Admits It Was Used to Incite Violence in My...
___________________________________________________________________
Facebook Admits It Was Used to Incite Violence in Myanmar (2018)
Author : ceilingcorner
Score : 357 points
Date : 2021-01-11 15:43 UTC (6 hours ago)
(HTM) web link (www.nytimes.com)
(TXT) w3m dump (www.nytimes.com)
| trident5000 wrote:
| Still in the app store I see.
| billfruit wrote:
| There was this "Facebook" riot in Bangalore in 2020, which
resulted in the deaths of 3 people from police gunfire.
|
| https://en.wikipedia.org/wiki/2020_Bangalore_riots
| tinalumfoil wrote:
| A lot of countries limit Facebook, and it makes sense that the
| attention on Facebook in the US is related to their US
activities. India has responded to Facebook over its
activities in India.
|
| https://en.wikipedia.org/wiki/Censorship_of_Facebook
| rogerdickey wrote:
| Boring. Facebook has been used for every type of communication,
| whether it's inciting violence or sharing cat pictures with
| grandma, by almost every internet-connected human.
| diveanon wrote:
| I volunteered at the Rohingya camps in Bangladesh while this was
| happening. Most of them were aware of this and attempted to stay
| ahead of the raids by monitoring some of the groups.
|
| Their stories are horrific and to this day I have bad dreams
| about what I saw / heard in those camps.
|
| One of my coworkers escaped through the jungle to Thailand where
| I now live. He lost his brother and father to a group with
| machetes.
|
| His story is incredibly depressing and makes me extremely
| grateful every day for what I have. We have spoken several times
| about the role of Facebook in what happened, and he has said
| repeatedly that nothing has changed and that it will happen
| again.
| lostlogin wrote:
| Have you or anyone you know made contact with Facebook about
| it? I'm not suggesting you should, I'm just wondering if anyone
| did and if there was ever any Facebook response that wasn't
pre-canned lawyer crap.
| diveanon wrote:
| I remember speaking to a few aid workers who were trying to
| get FB to donate to their efforts.
|
| To my knowledge they never received any money. If anyone can
| fact check that I would love to be corrected.
| tboyd47 wrote:
| Do you have any recommendations for charity groups working to
| lessen the plight of these people?
| diveanon wrote:
| There are several.
|
| A key thing to look for is whether or not the charity is
| focused on relocation. The conditions they were in while I
| was there were horrendous. The largest camp I visited was
| over 600k people.
|
| I was working with a few organizations, but from what I saw
Islamic Relief was the most effective at getting people out.
I believe most of the refugees relocated through that org
wound up in Aceh, Indonesia.
|
It's a humanitarian disaster that has been largely forgotten;
I shudder to think what COVID has done to their efforts.
| tboyd47 wrote:
| Thanks.
| flowerlad wrote:
| On the internet everything can appear equally legitimate.
Breitbart looks as legit as the BBC. Sacha Baron Cohen made
this point well: https://www.youtube.com/watch?v=ymaWq5yZIYM
|
| Excerpts:
|
| Voltaire was right when he said "Those who can make you believe
| absurdities can make you commit atrocities." And social media
| lets authoritarians push absurdities to millions of people.
| President Trump using Twitter has spread conspiracy theories more
| than 1700 times to his 67 million followers.
|
Freedom of speech is not freedom of reach. Sadly, there will
| always be racists, misogynists, anti-Semites, and child abusers.
| We should not be giving bigots and pedophiles a free platform to
| amplify their views and target their victims.
|
| Zuckerberg says people should decide what's credible, not tech
| companies. When 2/3rds of millennials have not heard of Auschwitz
| how are they supposed to know what's true? There is such a thing
| as objective truth. Facts do exist.
| gjs278 wrote:
lol, you're the one spreading misinformation now. 2/3 of
millennials have heard of Auschwitz. You just made that stat up
or are citing a false report with questions that don't
explicitly ask that. You should be banned for fake news under
your own system.
| anadem wrote:
| Thank you, I'd missed that and will share it
| fasdf1122 wrote:
| Time for all these SaaS companies to cut ties with Facebook.
| it wrote:
| And now they shut down Trump even though he called for a peaceful
| protest on Jan 6.
| Hnsuz wrote:
Why is Facebook not closed? Why not deplatformed? Why no public
shaming? Why do others get all this?
| kerng wrote:
| Why is Parler shut down and Facebook not?
| guerrilla wrote:
| Because USians care much more about what happens in the US than
| in Myanmar.
| maxerickson wrote:
| No service that Facebook depends on decided to stop doing
| business with Facebook the way AWS decided to stop doing
| business with Parler.
|
Parler built their infrastructure in one place, with critical
| dependency on that provider. So when the provider booted them,
| they ended up having to shut the service down.
| kerng wrote:
| Apple and Google still have the Facebook app in their
| respective stores. Why not hold everyone to the same
| standards? Or are there differences I'm not seeing at a high
| level? Maybe sophistication/existence of content moderation
| is better?
| bllguo wrote:
| is this rhetorical? Clearly because Westerners don't care
| about Myanmar but do care about the US; there's nothing
| more to it
| maxerickson wrote:
| I don't have the answer to those questions. My other
| comment is purely mechanistic, you'd have to ask Apple why
| they have not booted Facebook.
| hikerclimber wrote:
| nice. i hope for anarchy all across the us and the world so we go
| back to the stone age.
| sschueller wrote:
I guess you could argue Facebook should be shut down as parlor
| was. Although parlor was only used for hate, Facebook didn't do
| anything to stop it either.
|
I think a big issue is how Facebook's algorithm (and others)
are built for maximum profit, which at the same time also
radicalizes people in their filter bubbles.
| [deleted]
| [deleted]
| marcusverus wrote:
| > Although parlor was only used for hate
|
| I encourage everyone to keep an eye out for these casual little
| calumnies, presented without evidence.
| tathougies wrote:
Right. I used Parler to do such things as follow EWTN. I
| stopped using Twitter for this sort of thing because I
| figured that -- with EWTN's stance on the trans issue -- it
| was only a matter of time before they were banned (there had
| been talk of banning JK Rowling for similar tweets at the
time, so it's a reasonable fear). I joined Parler to not have
| to worry about this. This constant belittling of anyone who
| does not completely toe the line of whatever ethical system
| has taken over Twitter to classify people like me -- a brown
| man with immigrant parents -- as white supremacists is scary,
| dystopian, and insane.
| jandrese wrote:
| As a counterexample you claim to have joined Parlor because
| Twitter wouldn't let you hate transsexuals openly? This is
| supposed to support the premise that Parlor had a use for
| something other than to amplify hate speech?
| notahacker wrote:
| > Although parlor was only used for hate,
|
| There's your difference. Facebook is a network for a couple of
| billion people to communicate over which _incidentally_ serves
| to amplify some of their hate. Parler is a niche network to
| provide a safe space for stuff too extreme for the likes of
| Facebook.
|
| (and no, Parler shouldn't be forcibly 'shut down' either, but
| it's unsurprising businesses don't want to transact with it)
| esja wrote:
| "Parlor [sic] was only used for hate"
|
| What does this mean exactly? All the content on Parler was
| hateful? Or something else?
| 9front wrote:
| Maybe "parlor was only used for hate" but Parler, no. There is
questionable content on Parler, as much as there is on
Twitter, Facebook, etc.
| thelean12 wrote:
| Parlor was created in response to hate moderation on other
| platforms.
|
| It's not too far off to say "Parlor was created as a space
| for hate". _Only_ used for hate is obviously wrong, but that
doesn't change much.
| yters wrote:
| unfortunately, "hate speech" seems to be defined as
| "whatever makes me upset" where "me" is a group accepted by
the platform
|
| e.g. there are prolife groups that say nothing negative
| against any people, but get banned for explaining the facts
| behind abortion and who does it
| safog wrote:
| Have you been on /r/The_Donald at all? Or are you just
| making whatever you'd like up?
| yters wrote:
| sure, they might be hate speech, but seems no worse than
| what i see on daily kos
|
| however, i am talking about an entirely separate group
| that goes out of its way to not say anything hateful and
| not condone violence due to tragic occurrences in the
| past, yet gets banned due to pointing out inconvenient
| and very uncomfortable truths
|
| if free speech cannot protect uncomfortable yet unhateful
| speech, it is worthless
| elbrian wrote:
| Could you provide a couple examples of the types of
| groups you're referring to?
| yters wrote:
| yes, prolife groups
|
| in the past abortionists have been shot due to what
| prolifers have said, so now they go out of their way to
| avoid anything hateful and violence inducing, yet they
are getting deplatformed en masse in the fallout from January
6, which is entirely unrelated to their messaging
| [deleted]
| Nacdor wrote:
| > Parlor was created in response to hate moderation on
| other platforms.
|
| Again, not true.
|
| Parler was created in response to biased moderation of
| other platforms and was intended to be bipartisan.
| lazyjones wrote:
| > _Parlor was created in response to hate moderation on
| other platforms._
|
| Actually most people there seem to perceive it as an
alternative to the _biased_ nature of Twitter. They didn't
| want to get banned on Twitter for something that's being
| tolerated when done by left-leaning groups.
| rvz wrote:
Even if you wanted to, Facebook can't be de-platformed. They
are entrenched in the internet: they own their own data
centres, act as their own domain registrar, and are a member
of ICANN.
| spoonjim wrote:
| Ultimately if someone with guns and tanks decides to
| deplatform you, it's happening unless you can get your own
| guys with guns and tanks to protect you.
| Uhhrrr wrote:
| Also, our economy might not be able to handle the spike in
| productivity if it were to go away.
| pmlnr wrote:
| Anything can be de-platformed.
| tootie wrote:
| The distinction is that Facebook are failing in their
| moderation efforts while Parler prohibits themselves from even
trying. There's also the fact that Facebook has such an
| enormous footprint of positive or neutral content. It would be
| like shutting down air travel because some planes crash.
| trident5000 wrote:
| Is this not an absurd standard considering section 230
| exists? Where the govt literally tells platforms to not
| moderate? From my understanding Parler has been taking down
| explicit calls for violence but the bar is just high.
| Jtsummers wrote:
| > Is this not an absurd standard considering section 230
| exists? Where the govt literally tells platforms to not
| moderate?
|
| Section 230 does not "literally [tell] platforms to not
| moderate". It removes some degree of liability for content
| not produced by the host itself.
|
| If I, on Twitter, make a false statement (libel) against
| someone, Twitter is not liable for it (I am). Now, Twitter
| _could_ remove that false statement (on its own or directed
| by a court as part of or after a suit against me by the
| injured party). Whether they _need_ to remove it in order
| to avoid liability would depend on various other
| circumstances. For instance, if my content is actually
| illegal (by some measure) then Twitter could actually be
| legally obligated under other rules. But they remain free
| of liability for the post itself so long as they
| (eventually) comply with the law and remove it. But if they
| permit the illegal content to remain, then they could
| _eventually_ become liable for it (per other statutes).
|
| Moderation is, as a consequence of the laws in place, both
| permitted and occasionally mandated.
| GavinMcG wrote:
| > From my understanding Parler has been taking down
| explicit calls for violence but the bar is just high.
|
| Amazon pulled the plug specifically because they were not
| doing so.
|
| https://www.gadgetsnow.com/tech-news/read-amazons-letter-
| to-...
| trident5000 wrote:
| You cant really be using a letter from the company that
| dropped them as a source. Of course they are going to say
| all of this. Twitter just recently let hang mike pence
| trend for a while until it was taken down. Did they act
| fast enough? This is all subject to interpretation. We
| just had a summer of riots that caused quite a bit of
| destruction, did they act fast enough there? Again, there
| isnt a science to this, its highly discretionary.
| spaced-out wrote:
| To be fair though, Facebook is now a lot more aggressive at
| removing radical political groups/content.
| papaf wrote:
| Facebook is not co-operating with the genocide investigation:
| https://thediplomat.com/2020/08/how-facebook-is-complicit-
| in...
| jimbob45 wrote:
Should we shut down radio too, since radio is frequently used
in guerrilla warfare?
|
The technology isn't the problem -- the people feeling
violent are. How very American it is that we want to
immediately stamp out dissenting voices rather than actually
solve the problem that these people disagree with (whatever
it may be).
| MartinCron wrote:
| I don't really care to try to "solve the problem" that white
| supremacists have.
| imwillofficial wrote:
| And that's how things escalate.
| MartinCron wrote:
| So what is your alternative? Do we need to bend over
| backwards to appease the guys in "six million wasn't
| enough" shirts? How much political legitimacy should we
| give to people who are all-in on ridiculous conspiracy
theories? How much attention do Holocaust deniers
| deserve? Do flat-earthers deserve equal time in geography
| class?
| jimbob45 wrote:
| I'm speaking more broadly. _I_ don't feel inclined to solve
| ISIS' problems either but they're not going to go away
| unless we sit down with them.
|
| Hardline stances seem nice until people get hurt and
| progress isn't made.
| AniseAbyss wrote:
| Parlor refused to moderate. Facebook at least makes an honest
| attempt to.
|
And in case you haven't noticed, governments across the world
are taking on Facebook. Zuckerberg has been questioned in
Washington and Brussels.
| MrMan wrote:
| I think Facebook should be broken up but I havent heard any
| good ideas yet regarding how to somehow regulate social
| networks.
|
| Beyond hate speech, I dont see any reason to limit what people
| can say. Somehow limiting how the network connects people, or
| how content propagates, seems like the key. Only FB has done
| the social experiments at scale to know how to engineer this,
| which is faintly ironic.
| arrosenberg wrote:
| > I havent heard any good ideas yet regarding how to somehow
| regulate social networks.
|
| * Break them up, and keep them small enough so that no one
| site can do that much damage.
|
| * Remove safe harbor provisions on algorithmic content feeds.
|
| * Pass a transaction tax on ad auctions to discourage free-
| for-attention business models.
| p_j_w wrote:
| 1. How do you plan on keeping them small enough? With
| mandatory federation and compliance with open standards,
| would this even be a good idea?
|
| 2. Sounds like a great way to encourage site moderators to
| bring down the ban hammer hard and fast.
| hinkley wrote:
| I recall when I was going through a bout of insomnia in
| college, being upset by the quality of the ads on late
| night TV. I had been taught about what was then known as
| targeted advertising, and I thought that meant that all
| those inane ads were being specifically targeted at
| insomniacs. Like they had decided I was some sort of moron
| who would fall for this stuff. Didn't take me too long to
| realize the low quality ads were simply targeting the low-
| quality time slots. Nothing more sinister than cheap people
| being cheap.
|
With real-time auctions, though, there are whole swaths
| of people whom the quality ads have decided are a waste of
| time, and now the hyenas can circle in to prey on
| everything that is left.
|
| There are some ad networks out there that try to avoid the
| real time auction and cross-site tracking aspects of these
| conventional ad networks. I don't know how successful
| they're being, but maybe they're on to something. Something
that should be encouraged by public policy.
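The dynamic described above falls out of standard second-price auction mechanics. Here is a toy sketch; the advertiser names and bid values are invented for illustration and don't reflect any real ad network:

```python
# Toy second-price (Vickrey) ad auction: the highest bidder wins the
# impression but pays the second-highest bid. When the "quality"
# advertisers decline to bid on a user they consider a waste of time,
# whatever is left of the bid pool wins the slot for next to nothing.
def run_auction(bids):
    """bids: dict of advertiser -> bid in cents. Returns (winner, price)."""
    if not bids:
        return None, 0
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    # Winner pays the runner-up's bid (0 if unopposed).
    price = ranked[1][1] if len(ranked) > 1 else 0
    return winner, price

# A user the premium brands compete over: the clearing price is high.
print(run_auction({"brand_a": 120, "brand_b": 95, "miracle_pills": 3}))
# A user the premium brands have written off: the low-end ad wins cheaply.
print(run_auction({"miracle_pills": 3}))
```

The second call is the "hyenas circling" case: with no quality bidders left, the bottom-feeder ad takes the impression unopposed.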
| koolba wrote:
| Forcing them to split off Instagram and WhatsApp would be a
| fine start.
| snomad wrote:
At a certain level of daily active users (say 100k?), every
single post must be manually checked before becoming available
to the public. They should be checked for a) child porn b)
violence c) doxxing
|
| They are free to run algos to auto reject, sort whatever
| order.
|
This would likely be too burdensome, and the socials would
almost certainly have to start charging for access, which in
and of itself would probably remove much of the problem.
| lazyjones wrote:
| Here's a good idea: Poland's "freedom of speech" bill.
|
https://polandin.com/51388314/justice-minister-announces-
onl...

"Under its provisions, social media services will not be
allowed to remove content or block accounts if the content on
them does not break Polish law. In the event of removal or
blockage, a complaint can be sent to the platform, which will
have 24 hours to consider it. Within 48 hours of the decision,
the user will be able to file a petition to the court for the
return of access. The court will consider complaints within
seven days of receipt and the entire process is to be
electronic."
| lostlogin wrote:
I've got a reflexive "why did they implement this?" and
look for the downside. That law does appear good, but
Poland is on a dark path with its politics and leadership.
|
| https://www.amnesty.org/en/countries/europe-and-central-
| asia...
| lazyjones wrote:
| It might be a matter of perspective. Poland has seen both
| Communism and Nazi occupation in recent history and can
| tell when suppression of free speech starts being
| harmful, on either side of the spectrum. We here in the
| West tend to be a bit biased.
| notahacker wrote:
| Poland also has elected officials instituting 'gay free
| zones', and arguing that whilst Poland is progressive
| enough to have legalised homosexuality in 1932, people
| waving rainbow flags and _talking_ about homosexuality
| are as dangerous as Nazis and Communists. Poland is not
| free from bias.
| lenwood wrote:
| If the goal is to avoid capricious removal of content then
| this is great. If the goal is to avoid the platform being
| used for harm then this seems insufficient. The law cannot
| move fast enough to stay current with social trends.
| unethical_ban wrote:
| It is a tough nut to crack. Social media as we know it today
| brings some benefits. I thank some aspects of "woke" twitter
| for making me realize the reality of several societal ills,
| pulling me out of the "theory only" political alignment that
| is Libertarianism (IMO). Letting everyday people, including
| the poor, minorities, and targeted groups, have a direct
| platform to share their story with the world, is a positive
| thing. In other words, good-faith usage of the platform.
|
Clearly, there is a flip side. Private Facebook groups to share
| hate speech or otherwise echo awful thoughts. Legions of bots
| controlled by political organizations or nation-states trying
| to divide and conquer a population. It is a losing battle to
| moderate these platforms.
|
| Facebook has no inherent right to exist as it does, or
| rather, make a profit as it does - and I wonder if extreme
| measures, such as a partial repeal of section 230, should be
| considered. Like all laws, small-p-progressive measures
| should be taken. Small and medium websites should not
| necessarily be held liable for content posted by others. But
| as a network grows to have hundreds of millions of users,
| perhaps they should?
|
| I'm not sure either.
| MrMan wrote:
I am not on Twitter and have deactivated my FB accounts. Not a
| boycott, rather an attempt at quarantine. I want to remain
| friends with my friends at both ends of the political
| spectrum.
|
I feel that if we collectively understood social graph
theory better we could propose tweaks to make these
networks more "fair", and by fair I don't mean with respect
to a point of view, but rather less likely to create
emergent, undesirable social phenomena like ISIS, genocide,
and white nationalist rebellion.
|
| But I have never seen any papers that explore these topics,
| on the other hand my interests have been elsewhere.
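One crude way to make the parent's "fairness" idea concrete is to measure how segregated a network is, e.g. the share of edges that connect like-minded users. The metric and the toy graph below are invented for illustration, not drawn from any real study:

```python
def same_group_edge_fraction(edges, group):
    """edges: list of (u, v) pairs; group: dict node -> label.
    Returns the share of edges whose endpoints share a label.
    1.0 means a perfectly segregated (echo-chamber) graph;
    values near 0 mean most ties cross group lines."""
    if not edges:
        return 0.0
    same = sum(1 for u, v in edges if group[u] == group[v])
    return same / len(edges)

# Four users in two camps, with one cross-camp tie out of three edges.
group = {"a": "red", "b": "red", "c": "blue", "d": "blue"}
edges = [("a", "b"), ("c", "d"), ("b", "c")]
print(same_group_edge_fraction(edges, group))  # -> 0.666...
```

A platform tweak of the kind the parent imagines (say, occasionally surfacing cross-group content) could then be evaluated by whether it lowers this sort of number over time.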
| fsflover wrote:
| > I think Facebook should be broken up but I havent heard any
| good ideas yet regarding how to somehow regulate social
| networks.
|
| I think a better idea is to force them to provide an API for
| interoperability, alternative clients and exporting the user
| data. Together with the federated social networks, it will
| change everything: https://the-federation.info.
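As a sketch of what such an interoperability API could hand back, here is a minimal post envelope loosely modeled on the ActivityStreams vocabulary used by the federated networks linked above. The function and field choices are assumptions for illustration, not any real Facebook endpoint:

```python
import json

def export_post(author_handle, text, published):
    """Wrap a post in an ActivityStreams-style "Note" object so an
    alternative client or a federated server could ingest it.
    (Hypothetical export schema, for illustration only.)"""
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Note",
        "attributedTo": author_handle,
        "content": text,
        "published": published,
    }

note = export_post("@alice@example.social", "Hello, fediverse",
                   "2021-01-11T15:43:00Z")
print(json.dumps(note, indent=2))
```

The point is less the exact schema than that a common, documented envelope is what lets alternative clients and rival servers interoperate at all.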
| ForHackernews wrote:
| The best argument against this is that it means your data
will potentially end up splatted across even more
different servers controlled by various unaccountable
| parties.
| fsflover wrote:
| How? Your data will only be on server(s) which you
| choose. I did not suggest that everyone can download
| everyone's data.
| ForHackernews wrote:
| If I want to chat and share posts with 3 friends who use
| 3 different federated nodes, my details end up on all 3
| of those nodes.
|
| Consider email as an example of a federated network: My
| name, email address, and text of my emails exist today on
| hundreds (thousands?) of different servers. If I decided
| I want to "delete" my name from the global email network,
| that's close to impossible.
|
| The same would be true of a federated social network, but
| with a more intrusive and personal set of data. Users
| will not understand that they cannot delete their own
| mastodon toots (or whatever it calls them).
| nipponese wrote:
I don't know if this is the "right" thing to do, but I am
certainly in favor of hearing more forward-looking
solutions versus pre-internet solutions that merely
introduce friction to slow growth.
| biomcgary wrote:
| I think treating Facebook like a public utility and
requiring federation is akin to breaking up Ma Bell (the
| telephone monopoly, see:
| https://www.investopedia.com/ask/answers/09/att-breakup-
| spin...). The cost to society of phone calls dropped
| dramatically. It took a long time to break up the monopoly
| since the government benefited (according to
| https://talkingpointz.com/the-breakup-of-ma-bell/).
| worker767424 wrote:
| Cambridge Analytica happened because of an API and users
| who will click anything to learn if they're a Samantha or a
| Carrie.
| fsflover wrote:
| No, it's not because of an API, it's because of the
| monopoly of Facebook. Without the monopoly they would
value the security of their users and would be afraid to
lose them.
| bedhead wrote:
| "Beyond hate speech" lol. That's kinda the problem, isn't it?
| Who gets to define "hate"? And why ban speech that is
| actually legal (even if highly offensive) anyway? And if it's
| worthy of banning on a major platform, why not just make it
| explicitly illegal? These are serious and important questions
| (among others) that seem to get conveniently glossed over.
| rat87 wrote:
| > why not just make it explicitly illegal
|
| That violates freedom of speech
| bedhead wrote:
Wow, it's almost like the founding fathers had a good
idea when they drafted the Bill of Rights and the literal
first thing that came to mind was freedom of speech.
|
| I'm sure this experiment of ripping up the constitution
| in our de facto new online lives that usurp what the
| government is capable of doing will go just fine...
| mratzloff wrote:
| Pick a point and draw a line in the sand. Then enforce it.
|
| So many HN replies amount to "we all agree this is a
| problem, but we can't fix the entire problem perfectly, and
| it has some hypothetical drawbacks, so we shouldn't even
| try."
|
| (Never mind that as a result of inaction in the face of
| disinformation and hate speech our societies are rotting
| from the inside, and many, many real-world atrocities are
| being carried out as a direct result.)
|
| This is, by the way, a fundamentally conservative
| viewpoint. Cf. gun violence, homelessness, living wage,
| etc. Just because something is a complex issue with
| imperfect solutions doesn't mean we have permission to do
| nothing.
| colejohnson66 wrote:
| The big problem is that if you draw a line, everyone is
| going to toe it and try to push past it. Trump has shown
| that he is willing to push the boundaries of what is
acceptable throughout his entire presidency.
|
| That's the "slippery slope" argument. If you define
| what's allowed, people will ask for more, and others will
push past it, saying it's not much different from what
came before.
|
| And besides that, the line _has_ been drawn many times by
| the Supreme Court. Hate speech is allowed by the First
Amendment, but inciting violence may not be. There are
"tests" for these sorts of issues that lower courts are
supposed to apply.
| bedhead wrote:
| But now you're back to square one - who defines "hate"?
That's the line you're talking about. Keep in mind
| that in many cases, some speech you consider "hate" is
| totally vague, and opinions will inevitably just fall
along convenient ideological lines. So, outside of some
| really explicit cases, it's really not definable at all.
| notahacker wrote:
| I tend to agree with this, but given we're discussing
| Myanmar here I think it's worth adding that knowing where
| to draw the line can get a lot more complex than deciding
| 'Hang Mike Pence' crosses it.
|
| Myanmar's language and culture are completely alien to
| people drafting Facebook policies, driving forces behind
| intercommunity violence include things like [likely at
| least partially true] news reports of other
| intercommunity violence and official government
| statements, and then there's nuances like Burmese people
| seemingly accepting the false claim the ethnically-
| cleansed Rohingya actually were Bangladeshi regardless of
| where they stand on other things, and the outpouring of
support for Aung San Suu Kyi after Western criticism
| that might have been signals that they believed the
| conflict was the generals' doing rather than hers or
| might have been mass endorsement of the government's
| violence. I suspect my Myanmar-based Facebook friends'
| one or two allusions to burning villages and politicians
| are probably calls for peace and meditation, but
| honestly, I don't know.
| wolco5 wrote:
The other side is that Facebook shouldn't offer a service
to a country/people it can't support.
| nend wrote:
| >Who gets to define "hate"?
|
| I really dislike this argument. A lot of democratic
| countries have defined hate speech. In the US, individual
| companies define it and moderate it as they see fit. In
| other countries, their legislators and courts define it.
The US has already defined lots of other hard-to-define
terms.
|
| >And why ban speech that is actually legal (even if highly
| offensive) anyway?
|
I mean, if we're talking about why companies should ban it,
I'm sure they have a variety of reasons, ranging from bad
PR and loss of revenue to the founders/owners/employees not
wanting to build a product that's used for violence.
|
If we're talking about the society level, it's because it
threatens democracy, peace, and individuals' safety.
|
| >And if it's worthy of banning on a major platform, why not
| just make it explicitly illegal?
|
| A lot of democratic countries already have. UK, Canada,
| France, Germany, New Zealand, Sweden, Switzerland (and yes,
| they managed to define it somehow):
| https://en.wikipedia.org/wiki/Hate_speech_laws_by_country
| bedhead wrote:
| If this speech threatens democracy, peace, and safety,
| why hasn't it been made illegal by the government, and
| how has the US managed to do so well for centuries with
| it being perfectly legal? Odd, the speech you're talking
| about being legal hasn't seemed to affect things much at
| all. Actually, we've done nothing but thrive despite hate
| speech being legal. Why haven't people been clamoring for
| decades to change the constitution because of all the
| mayhem caused by hate speech?
| nend wrote:
| >If this speech threatens democracy, peace, and safety,
| why hasn't it been made illegal by the government
|
| Because of the first amendment, and how the courts have
very consistently interpreted it to allow hate speech.
|
| >and how has the US managed to do so well for centuries
| with it being perfectly legal?
|
| If by "do so well for centuries" you mean the US's
| economic output and world status over the centuries, I
| would argue that profiting off of Europe rebuilding
| itself after two world wars probably outweighed the
| detrimental effects of hate speech (among probably
| several dozen other reasons).
|
| If you mean "how has the US done so well handling the
| negative effects of hate speech for centuries without
| making it illegal", I would argue that hate speech has
| contributed to some of the most shameful and barbaric
| social dynamics over the centuries, and the US is
| historically well behind other modern countries on this
| front.
| bedhead wrote:
| I've noticed a high correlation between people who
| callously want to ban all sorts of speech and people who
| just seem completely miserable and think the US is the
| most awful place on earth.
| spaced-out wrote:
| >If this speech threatens democracy, peace, and safety,
| why hasn't it been made illegal by the government, and
| how has the US managed to do so well for centuries with
| it being perfectly legal?
|
| Because we've generally dealt with that by deplatforming
| that sort of speech socially and/or in the private
| sector.
|
| The history of the civil rights movement is filled with
| boycotts and other sorts of social pressure campaigns.
| crististm wrote:
You don't get to enter into debate and get away with
"disliking" others' argumentation!
| nend wrote:
| That's typically the reason why I get into a debate.
| surge wrote:
| >A lot of democratic countries have defined hate speech.
|
They have very poor, subjective definitions that boil down
to anything any group considers offensive, which is a
moving needle, and that can make things like satire and
certain forms of comedy illegal and have a chilling
| effect on valid criticism. It's also compelled speech,
| and in some cases leaves violence as the only resort, as
| opposed to a conversation and de-conversion from
| extremist beliefs. We've seen abuse of it in several
| cases, if no one is offended, they'll create a group or
| pay someone to be offended. There's no burden of proof
| beyond someone's emotional state upon hearing the words.
| You can say something to two people of a group, one might
| think its funny and laugh, another might report you and
| call the cops.
|
Also, it turns out you can yell "fire" in a theater,
especially if you at least believe it to be true, and
that's something courts can't determine; lots of people
say things that they believe to be true that turn out to
be false. Likewise, if speech is dangerous but true, it
should still be protected.
|
| In either case, best this be settled in courts and
| legislation, not corporate meeting rooms that are echo
| chambers of opinion.
|
| >If we're talking about at the society level, because it
| threatens democracy, peace, individuals safety
|
| We already have laws against that, which is why hate
| speech laws are usually redundant. They are likely to be
| abused and to scope-creep into silencing valid criticism
| of an individual or group, who can then claim offense and
| have you arrested, at the very least putting you through
| months or years of legal trouble before you're acquitted,
| and that's only if you can afford a proper defense.
|
| You people need to look at history. It boggles my mind
| how uneducated people are today on the context of this
| issue.
|
| Even the person who did the Parler leak is a
| Meiklejohnian absolutist.
| lovecg wrote:
| You talk a lot about theoretical outcomes, but isn't it
| reasonable to look at what different systems result in
| empirically? Which of the aforementioned countries had a
| violent assault on their seat of government in recent
| times? Does that support your argument, or maybe there
| are virtues to those alternative legal frameworks?
| nend wrote:
| >In either case, best this be settled in courts and
| legislation, not corporate meeting rooms that are echo
| chambers of opinion.
|
| I'm really confused by what your point is. You spend
| most of your post talking about how laws against hate
| speech are ineffectual because courts can't determine a
| person's emotions and beliefs, and then say it's best
| handled by laws and courts.
| MrMan wrote:
| Lots of democratic countries have come up with workable
| definitions. I propose to make hate speech illegal, so
| banning it won't be in any way contradictory. I am not
| glossing over anything.
| wesleywt wrote:
| How would you define calling for the genocide of the
| Rohingya people? Would you define that as hate speech? This
| is an important question for right wingers to answer.
| Should we be tolerant of your intolerance?
| mc32 wrote:
| Yes, it's a very difficult problem and one with bad
| solutions (see fictitious chatter below):
|
| I like Indian curry.
|
| Yes, Indian curry is the best curry.
|
| Yes, Indian curry in Japan is not the same. They don't know
| how to make it.
|
| Yes, Indians are better at making curry. After all it was
| invented in India and we have the local spices.
|
| Definitely, I would not buy curry in Japan unless it's
| made by an Indian who knows how to make it.
|
| I mean you can see this potentially going in an unwanted
| direction. But is it hate?
|
| To me hate is when you have immediate violent outcomes in
| mind.
| klmadfejno wrote:
| For real though, if you haven't tried Japanese curry, try
| to find some. It's really tasty (sweeter and not spicy)
| and not at all like Indian curry.
|
| (Indian curry is good too but most people have access to
| try it)
| mc32 wrote:
| Ha! I like Japanese curry too. But I meant Indian curry
| prepared in Japan. I did not mean Japanese curry.
|
| My point was that people can have discussions about who
| or what is better and it can diverge into other areas
| that could be considered hate by very sensitive people.
| Japan notoriously claimed for example that their snow is
| unique[1] and thus made it difficult to import European
| skis some time ago. Today this would be viewed as
| xenophobic or something, when all it was was
| protectionism.
|
| [1] Kinda true. It's 'wetter' than snow in some other
| places, but that doesn't mean Rossignol should not be
| sold in Japan.
| bedhead wrote:
| "In mind" - what does this mean? Who exactly gets to
| define "in mind"? Like, you're now literally reading
| people's minds about their intentions? So I don't need to
| explicitly tell people to go do bad things, I can just
| say I'm upset about something and that's enough? I can
| make some bogus claim and that's enough? Because there
| aren't that many people spewing bullshit all day long on
| twitter? And let me take a wild guess on what your
| conclusion will be for people who say vague things on
| "your" side vs the "other" side...
| anigbrowl wrote:
| Endlessly peppering people with questions is sometimes
| called Sealioning. How about engaging with the point(s)
| made by the person you're replying to, and offering your
| own suggestions for how you'd go about things. Also, the
| HN guidelines (link at the bottom of the page) encourage
| you to respond to the strongest possible interpretation
| of what was said, rather than the worst.
| bedhead wrote:
| But that's the ENTIRE point here! This supposed itty-
| bitty exception - "Hey, just no _bad_ stuff, okay?" - is
| actually EVERYTHING. You say something like "Just no hate
| speech" or "Only if they have violence in mind" and those
| statements inherently violate the very notion of free
| speech the people are erroneously saying they're in favor
| of, and those innocent-sounding qualifiers are why things
| are devolving so rapidly. We've taken a simple concept
| that worked brilliantly for 250 years (free speech) and
| in the blink of an eye, now that online life has (for
| better or worse) usurped the government's role in setting
| rules for society, we're just rearranging how the game is
| played. This is the broader issue, online life (for lack
| of a better phrase, but I hope you get my point) has
| become so ubiquitous that it's like some sort of
| alternate society with a new governance. We
| wouldn't/couldn't insert these qualifiers into the
| constitution, but here we are giddily doing it for our
| new alternate world.
| newen wrote:
| That's not what sealioning is.
| mc32 wrote:
| That's the difficulty, isn't it? If I say, let's go and
| burn that police station down now! Let's be there at
| 9:15. It's pretty clear. If you say, I wish that police
| station would get burned down. I don't know. It's not a
| nice thought, but you're not actively working to have it
| burned down. On the other hand the wish does not have
| pure intentions.
|
| This is why I think it's impossible to "monitor" and
| purge or ban violent tweets or what have you.
|
| One, what is the intention of the speaker? Two, who is
| responsible for the audience's reaction? Do you take the
| most extremely negative interpretation? It cannot work.
|
| And that's not even taking slang into account where
| violent words don't mean violence and normal words can
| take on other meanings.
| wolco5 wrote:
| Context matters.
|
| "Let's go and burn that police station down now"
|
| If anyone I knew said that I would treat it as a dry
| joke.
| r00fus wrote:
| What's your point? I think the line is drawn when people
| are put at risk of life or liberty or have died/suffered.
| At that point, warning then excising those users and
| groups, reporting them to authorities (if crimes have
| been committed) is what's needed. And it needs to be
| timely.
|
| As I understand it, Facebook, if it did anything at all,
| did it months or years after the fact, essentially doing
| CYA, not anything meaningful.
| fairity wrote:
| I'd love Facebook to do more to prevent misuse of its platform,
| but I don't see why I should blame Facebook for Myanmar violence.
|
| Facebook at its core is a tool that helps people spread
| information and opinions - not too different from the telephone
| or email. The blame for Myanmar violence should be placed
| squarely on the people that are spreading misinformation that
| incites violence. Even if Facebook does a better job policing
| content, don't you think these perpetrators will find an equally
| effective technology to spread their propaganda? If not now, then
| in the future surely they will?
|
| Don't get me wrong, I do think Facebook should police clear
| misinformation that can lead to human suffering, but it seems
| like the media want to convince me to hate Facebook. I hate the
| people behind this misinformation. And, I would love Facebook to
| get involved, but I don't hate them.
| jeromegv wrote:
| Those arguments always forget 1 thing.
|
| Facebook recommends users join groups and follow pages,
| and even shows content that is not from people you
| follow. So it isn't just a telephone company that allows
| you to make phone calls; it's a service that makes an
| editorial choice about what it shows you.
|
| If they show you posts that incite violence in Myanmar
| and show you fake news that makes you outraged about a
| certain ethnic group, it's possible that among a
| population of many millions, some people will decide to
| act on it. Remember, those people might never have been
| exposed to it otherwise; it's Facebook that recommended
| this content based on the engagement the posts were
| getting!
| jenwkejnwjkef wrote:
| I'm not sure why anybody falls for the frankly absurd idea
| that social media sites are platforms and not publishers.
| They choose what to show you, just like the NYT or any other
| traditional publisher. The only difference is who does the
| choosing---algorithms or humans. But that, to me, is just a
| logistical detail and shouldn't have an effect on the
| distinction between publisher and platform. To think that
| Facebook and other companies act as merely connectors between
| viewpoints and are really just platforms for free speech is
| ridiculous.
|
| From the article, no, Facebook did not "fail to prevent its
| platform from being used to foment division and incite
| offline violence;" they published content which foments
| division and incites offline violence. The distinction is
| important.
| MrMan wrote:
| I agree with this. FB has succeeded in their marketing
| effort to be perceived, even by their detractors, as
| somehow separate from "mainstream media" when in fact they
| are one of the biggest media companies on the planet.
| platz wrote:
| > Those arguments always forget 1 thing.
|
| > Facebook recommend users to join groups, follow pages, and
| even shows content that is not from people you follow
|
| See comment: https://news.ycombinator.com/item?id=25732255
| fairity wrote:
| > So it isn't just a telephone company that allows you to
| make phone calls, it's a service that makes an editorial
| choice about what they show you.
|
| Understood. Perhaps there's a more apt analogy. Let's say
| there exists a gun manufacturer who's developed a new type of
| gun that's particularly effective. The problem is the guns
| are in limited supply. The manufacturer ultimately cares
| about profit, so it decides to sell the guns to the highest
| bidder.
|
| Most of the guns sold end up providing value to society in
| the form of protection. But, in a handful of cases, the
| highest bidder happens to be a violent regime that uses the
| guns for ethnic cleansing.
|
| Do we blame the gun manufacturer or the violent regime?
|
| From this perspective, I agree that a small part of the blame
| falls on the gun manufacturer for not vetting buyers.
| MrMan wrote:
| Let's say the gun manufacturer sees a presentation given
| by executives at a social network: "We can create
| conditions that drive demand, because our social
| scientists have 1/4 of the planet as guinea pigs." The
| gun manufacturer takes out ads, and also gives money to
| "affiliates" like Ted Nugent to create viral content.
|
| I am not saying that FB execs make such a presentation, nor
| that TN is directly funded by the NRA, but the
| relationships are all there - interested parties are aware
| that Facebook can create results.
| matz1 wrote:
| >Facebook recommend users to join groups, follow pages, and
| even shows content that is not from people you follow. So it
| isn't just a telephone company that allows you to make phone
| calls, it's a service that makes an editorial choice about
| what they show you.
|
| But these are all algorithmic, aren't they? It's not like
| Facebook decided to manually show certain content to some
| people, unlike the silencing done to Trump.
| chopin24 wrote:
| People develop algorithms. They aren't some neutral force
| of nature, like a tornado. These are choices.
| matz1 wrote:
| Yes, but do the algorithms say: spread only bad content?
|
| I doubt it. The algorithm may be: spread content that
| increases engagement, without making an explicit judgment
| about whether the content is 'good' or 'bad'.
| chopin24 wrote:
| Intention is beside the point. The only thing that
| matters is impact. For example, in US case law, there is
| the concept of _involuntary manslaughter_: killing as
| the result of negligent or reckless actions.
| Karunamon wrote:
| That doesn't matter at the end of the day.
|
| Facebook employee decides what posts to show you ->
| Facebook Inc is responsible for picking and choosing
| content (exercising editorial control) -> Facebook is a
| publisher.
|
| vs
|
| Facebook employees write a post-engagement algorithm ->
| Facebook's computer program decides what posts to show
| you -> Facebook Inc is responsible for picking and
| choosing content (exercising editorial control) ->
| Facebook is a publisher.
| matz1 wrote:
| >Facebook employee decides what posts to show you
|
| Here FB manually decides what constitutes 'good' or 'bad'
| content. For example: an FB employee decides Trump's
| content is 'bad' and thus doesn't allow it.
|
| >Facebook employees write post engagement algorithm
|
| Here, depending on how the algorithm works, it may not
| make a judgment about whether the content is 'good' or
| 'bad' as long as it drives engagement. So 'good' and
| 'bad' content can spread equally.
|
| If it's algorithmic, then Trump's content doesn't depend
| on whether someone decides it's 'good' or 'bad', but
| merely on the engagement metric.
|
| I would support the latter, even if the content that
| finally shows up is something 'bad', because here FB
| merely acts as a tool.
| Karunamon wrote:
| The metrics they use for exercising editorial control do
| not matter for this line of logic. The fact that they are
| exercising editorial control, promoting some posts,
| censoring others, rather than neutrally carrying the
| content people wish to send (like a phone company or an
| ISP) makes them a publisher.
|
| The alternative is to allow these companies to hide
| behind "but we didn't do it, the algorithm did". The
| algorithm is an agent of the company and they are
| responsible for its behavior like any other employee.
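[A purely hypothetical sketch of the engagement-only ranking the commenters describe; all names and weights here are invented. The point it illustrates: the code never labels content 'good' or 'bad', yet the weighting and ordering it performs are still editorial choices made by its authors.]

```python
# Hypothetical sketch of an engagement-maximizing feed ranker.
# Nothing here reflects Facebook's actual system; the point is that
# ranking purely by engagement is itself an editorial choice.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Illustrative weights: shares and comments are weighted higher
    # because they spread content further than a passive like.
    return 1.0 * post.likes + 3.0 * post.shares + 2.0 * post.comments

def rank_feed(posts: list[Post]) -> list[Post]:
    # No notion of 'good' or 'bad' content appears anywhere;
    # the most provocative posts simply tend to score highest.
    return sorted(posts, key=engagement_score, reverse=True)
```

[Under this scheme a post that provokes many shares and comments outranks a more-liked but quieter one, with no human reviewing either.]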
| vuln wrote:
| Facebook manipulated users' emotions for science without
| user consent.
|
| https://www.forbes.com/sites/kashmirhill/2014/06/28/faceboo
| k...
| Fellshard wrote:
| I fall on the side of 'curation and modification of feeds
| represents creative work and therefore makes the platform
| responsible' for this very reason.
| ljm wrote:
| Yeah, I don't think Facebook, Google, Twitter, etc. get a
| free pass on this just because they automated the process
| and took humans out of the equation. They can't claim to
| not be responsible for their algorithms.
| MrMan wrote:
| Yes, it seems like the same stochastic behavior that
| drives marketing conversions: as numbers increase, a
| conversion is almost surely going to happen. But
| regulation and complaints about the service always focus
| on the micro, the anecdote, the qualitative.
| mcphage wrote:
| > not too different from the telephone or email
|
| Facebook is a broadcast medium, so very different from
| the telephone. Email is somewhere in between: it works
| fine 1-to-many, but not so well many-to-many.
|
| > don't you think these perpetrators will find an equally
| effective technology to spread their propaganda?
|
| Maybe they will, but that's hardly a reason to ignore these
| atrocities. Maybe they won't. Maybe future technologies will be
| more careful about encouraging people to join extreme groups
| because it increases engagement.
|
| > it seems like the media want to convince me to hate Facebook
|
| Nobody's asking you to hate Facebook, but you should be
| aware of the power they possess, and how irresponsible
| they are being with that power. You don't need to bring
| your personal feelings into this at all.
| ng12 wrote:
| I agree with you 100%. It's why I don't like that Parler is
| being forced off the web.
| 99_00 wrote:
| I was surprised the article and headline didn't use the
| word genocide. But it looks like it is consistent with
| other NY Times articles from the time.
|
| This article was published (Nov. 6, 2018) after a UN
| official declared it a genocide. Yet genocide doesn't
| appear in the article.
|
| UN official convinced of Myanmar Rohingya 'genocide' - March 12,
| 2018 https://www.cnn.com/2018/03/12/asia/myanmar-rohingya-un-
| viol...
|
| Another NY Times article, from August 2018, is also
| reluctant to call it a genocide.
|
| >United Nations officials have raised the prospect that the
| violence could be considered genocide, and officials at the
| United States State Department have debated using the term,
| according to American diplomats.
|
| https://www.nytimes.com/2018/08/25/world/asia/rohingya-myanm...
| ndiscussion wrote:
| It's common for American media to minimize any genocide
| caused by the United States or its corporations. The
| Holocaust gets a pass because it has a convenient
| scapegoat.
| cm2012 wrote:
| From my understanding, the Myanmar atrocities were mostly
| planned in private groups and chats by non-notable
| figures, with private chain messages playing a big part.
|
| It's one thing to ban a politician inciting violence. Do you all
| want FB to monitor all your private messages for keywords, then
| block those messages?
| boomboomsubban wrote:
| From what I've read, the atrocities were largely carried out by
| the military, who also had hundreds of people making fake
| Facebook accounts to spread their propaganda.
|
| Monitoring private messages likely wasn't even possible,
| as Facebook had just four employees capable of
| understanding the language.
| papaf wrote:
| I found the best modern link, with descriptions of UN activity,
| in an article in the Diplomat:
| https://thediplomat.com/2020/08/how-facebook-is-complicit-in...
| dmode wrote:
| WhatsApp is even more dangerous in developing countries,
| and is used to spread dangerous rumors at rapid scale.
| This has resulted in public lynchings based purely on
| false rumors.
| MrMan wrote:
| WhatsApp is FB too
| j16sdiz wrote:
| Any unmoderated instant messenger can be used to spread
| rumors at rapid scale.
| rootsudo wrote:
| Not just violence, genocide.
|
| Funny (not funny) how it's spun and minimized though.
|
| Also, paywall.
|
| https://www.bbc.com/news/world-asia-46105934
|
| from the BBC:
|
| "The network said it had made progress in tackling its problems
| in Myanmar but that there was "more to do".
|
| Last year, the Myanmar military launched a violent crackdown in
| Rakhine state after Rohingya militants carried out deadly attacks
| on police posts.
|
| Thousands of people died and more than 700,000 Rohingya fled to
| neighbouring Bangladesh. There are also widespread allegations of
| human rights abuses, including arbitrary killing, rape and
| burning of land"
| throwoutttt wrote:
| Don't let this distract you from grandpa posting mean stuff on
| Parler
| chmod600 wrote:
| Facebook and Twitter were also used for the Arab Spring.
| johncena33 wrote:
| I am from Bangladesh. I know for a fact that FB has been
| used (and is still being used) to incite religious
| extremism and hatred against religious minorities in my
| country. Social media has been one of the biggest causes
| behind the rise of religious extremism in South Asia in
| general. For more reference please see:
| https://en.wikipedia.org/wiki/2012_Ramu_violence.
| CivBase wrote:
| I believe Facebook has set a precedent it can never hope to keep
| up with - not so long as it keeps the algorithm on and designed
| to maximize engagement. It turns out terrible things are often
| very engaging.
| timothyduong wrote:
| Last time this was brought up, HN user @Zackees was
| debating semantics about whether this was a genocide or
| not.
|
| Would love to hear his perspective now that FB admits
| this. I guess it's 'technically' not 'genocide', so it's
| all a-ok according to @Zackees; it's all the MSM's fault.
|
| https://news.ycombinator.com/threads?id=zackees
___________________________________________________________________
(page generated 2021-01-11 22:01 UTC)