[HN Gopher] Private censorship is not the best way to fight hate...
       ___________________________________________________________________
        
       Private censorship is not the best way to fight hate or defend
       democracy (2018)
        
       Author : Jimmc414
       Score  : 267 points
       Date   : 2021-09-29 13:56 UTC (9 hours ago)
        
 (HTM) web link (www.eff.org)
 (TXT) w3m dump (www.eff.org)
        
       | b215826 wrote:
       | Ironic that this is from the EFF, who were happy to dogpile on
       | RMS when he said some things that some people took offense to
       | [1]. Just like private censorship, trying to cancel someone is
        | not the best way to fight hate.
       | 
       | [1]: https://www.eff.org/deeplinks/2021/03/statement-re-
       | election-...
        
       | faet wrote:
       | Banning hate communities does work though [0]. I assume that the
       | results are similar for twitter and facebook. "Hateful" Reddit or
        | FB communities also don't allow "free speech". The moderators
        | will ban people who go against the grain. There is no free
        | exchange of ideas or dispute: if you go against the grain,
        | you'll just get banned from that community (but not the
        | platform). As such, the platforms can either let the hateful
        | communities exist in their 'safe space silo' or de-platform
        | them.
       | 
       | [0] http://comp.social.gatech.edu/papers/cscw18-chand-hate.pdf
        
         | BobbyJo wrote:
         | Define 'works'. This subject is an easy one in which to
         | conflate goals. The goals of the platform are different than
         | the goals of the public, lawmakers, etc. which is what the
         | article is about.
         | 
         | The goal of the platform is to pull in more users. Having less
         | abusive language and users is a means to that end, as is
         | appeasing general public outcries for more moderation.
         | 
         | The goal of the public and lawmakers is, hopefully, more
         | effective and meaningful discourse.
         | 
          | The paper you link measures the former outcome, not the
          | latter.
         | 
          | Edit: Just to comment on the outcome you are referring to,
          | though: I don't believe the bans actually led to more
          | meaningful and effective discourse on Reddit. I don't even
          | think they led to more civil discourse. You can click on
          | literally any politically sensitive topic on the front page and
          | you'll find that almost all the top comments both lean in a
          | specific direction (diversity of discourse is terrible) and
          | deride anyone who disagrees. Maybe they use the 'F' word less,
          | and no one is using the 'N' word anymore, but it is very
          | obvious that it has become a strongly homogenized platform that
          | doesn't welcome nuanced opinions.
        
           | kitsunesoba wrote:
            | I suspect that for reddit specifically, homogeneous opinion
            | and lack of meaningful discourse have more to do with the
            | voting/karma system and its inability to scale well.
           | 
           | Nuance gets crowded out because posts and comments presenting
           | polarized opinions are much more likely to get acted on by
           | readers than their more nuanced counterparts are. Posting and
           | voting becomes more about the dopamine hit from the gained
           | karma and sense of being correct than it is about discussion.
           | 
            | This is why, while it still isn't perfect, one can often see
            | higher quality discussions in smaller niche subreddits where
            | there's no critical mass to push dominant opinions. In those
            | communities, posting extreme comments just makes one look
            | like an attention-seeking troll.
           | 
           | So in my view, moderation or lack thereof is almost
           | orthogonal to an online community's propensity for quality
           | discourse. That seems more related to the designs of and
           | incentives given by the software these communities run on.
        
             | BobbyJo wrote:
             | I agree with most of what you've said. I really only
             | disagree with the last bit, as it ignores the network
             | effects of moderation, and the fact that moderation applied
             | by topic is _necessarily_ polarizing along the lines of
             | that topic.
             | 
             | I didn't intend to prove "bans cause homogenization", but
             | rather cast doubt on "bans achieve the stated goal".
        
           | majormajor wrote:
           | "More effective discourse" is probably far too aspirational a
           | goal for lawmakers/government. It's also probably too
           | subjective.
           | 
           | Less violence, on the other hand, is a clearer goal that
           | aligns with existing laws against violence, so it has a
           | firmer basis, in line with existing restrictions on speech.
           | It also gives you a non-partisan measuring stick.
           | 
            | Private platforms, on the other hand, will have more
            | flexibility in trying to maintain, say, discussion quality as
            | a goal (see HN's stated moderation goals vs Reddit's), or to
            | target disinformation, language/profanity, or whatever else a
            | private moderator may choose.
           | 
            | The government telling those private parties that they
            | _can't_ set their own standards, though, seems like a
           | particularly terrible direction to go.
        
             | BobbyJo wrote:
             | It's no more subjective than 'less violence' provided the
             | interested parties define measures for success. I mean, if
             | the government can set goals for employment, then surely
             | they can set them for civic engagement and disinformation.
             | 
             | Totally agree with your last two points.
        
         | PerkinWarwick wrote:
         | >"Hateful" Reddit or FB communities also don't allow "free
         | speech". The moderators will ban people who go against the
         | grain.
         | 
          | Which is fine, I think, but why have Reddit or FB do the
          | censoring (aside from things that are outright illegal)? I
         | don't much care if a bunch of Nazis or tankies are busy
         | planning world domination on Reddit while sharing recipes. Why
         | do you care?
        
           | fighterpilot wrote:
           | > Why do you care?
           | 
           | Because zero moderation beyond criminal content is 8chan,
           | which arguably inspired mass shootings.
           | 
           | https://www.washingtonpost.com/technology/2019/08/04/three-m.
           | ..
        
             | PerkinWarwick wrote:
             | Zero moderation beyond criminal content is/was basically
             | Usenet alt groups. Somehow the world survived all those
             | years.
             | 
             | Listen, I understand the need to cordon off the wrongthink
             | people so that they can't communicate with each other, I
             | just don't agree with it.
        
             | notchFilter54 wrote:
              | And when you ban misinformation on your platform, that
              | "anti-vaxxer" instead goes to 8chan, which by your argument
              | might inspire them to shoot people up.
        
               | joshuamorton wrote:
                | Given the size of 8chan and the sizes of the communities
                | banned from Reddit, we know this to be untrue.
        
               | notchFilter54 wrote:
                | Given that Texas has a greater population than Rhode
               | Island, we know it is untrue that my uncle Bill and his
               | friends upset with Texan policies moved from Dallas to
               | Providence.
        
               | fighterpilot wrote:
               | But you want Reddit to literally adopt 8chan's moderation
               | policy, meaning that Reddit now will become the place
                | that inspires mass shooters instead of 8chan (which, by
                | the way, is no longer a place that inspires mass
                | shootings, since it was killed by Cloudflare after the
               | last one, and replaced with the impotent and unpopular
               | 8kun).
        
               | notchFilter54 wrote:
               | >But you want Reddit to literally adopt 8chan's
               | moderation policy,
               | 
               | I want reddit to adopt the public square's policy of
               | allowing any content that isn't illegal, which also
               | happens to be pretty much synonymous with 8chan's policy.
               | 
                | Do you consider the town square (which has the policy of
                | allowing content that is not illegal) a center of
                | inspiration for mass shootings? Could the fact that the
                | public square is not viewed as a place of inspiration for
                | mass shootings have anything to do with the integration
                | of many ideas, and the fact that someone bringing bad
                | ideas might actually be challenged in an environment
                | where they are exposed to the general ideas of the
                | community rather than an echo chamber of fellow nazis or
                | whatever?
               | 
               | The nazi hall may have the same moderation policy as the
               | town square, that doesn't mean I expect the same
               | inspirations to come out of the nazi hall. The issue with
               | the nazi hall is the powder-keg full of people
               | reinforcing bad ideas, whereas a nazi in a more "normal"
               | place like the public square might have some chance of
               | being shamed or convinced their anti-social ideas are
               | undesirable (despite the nazi hall and public square
               | having same moderation policy). I don't want to shove
               | more people into the nazi hall by banning them from the
               | public square (especially when they're only being banned
               | from the square because they have unconventional views on
               | vaccines.)
               | 
               | ---------------
               | 
                | In the censor's world, the people with undesirable ideas
                | in the public square are kicked into 8chan where, instead
                | of their ideas being challenged, they all end up in a
                | self-reinforcing chamber. The proportion of people
                | wanting a mass shooting there may be tenfold that in the
                | public square, leading to more concentrated exposure,
                | including for people who were originally just anti-vax or
                | whatever. And the people running the public square turn
               | around and say "see, 8chan allows any ideas, and that's
               | what happens when you do that!"
        
               | fighterpilot wrote:
               | "Do you consider the town square a centre of inspiration
               | for mass shootings"
               | 
               | Before the internet, yes, definitely. Maybe not mass
               | shootings specifically because that seems to be a recent
               | fashion trend after Columbine, but violent extremism in
               | general. How do you think Hitler managed to secure over
               | 40 percent of the democratic vote in the early 1930s? How
               | did Osama Bin Laden recruit extremists who were willing
               | to put a bomb into the WTC basement? Propaganda, speech.
               | 
               | This idea that unfettered speech in the public town
               | square, even if it isn't directly inciting violence,
               | can't lead to pathological outcomes just doesn't hold up.
               | 
               | This isn't even an argument for government censorship.
                | It's merely me recognizing that these types of outcomes
               | can come about.
               | 
                | Nowadays almost all extremist speech is online, because
                | that's where there is distribution and anonymity, so the
                | analogy breaks down.
                | 
                | "where they are exposed to the general ideas of the
                | community rather than an echo chamber"
               | 
               | This isn't a bad argument, but you have to balance it off
               | with the knowledge that ideas are highly, highly
               | contagious. On balance, I think giving such ideas
               | distribution to a billion eyes is far more harmful than
               | pushing a fringe into echo chambers which already existed
               | before social media censorship began anyway (such as the
               | Stormfront forum).
               | 
               | Moreover you have to recognize that these isolated echo
               | chambers would naturally self-segregate on Reddit if
                | given free rein, and so in practice you haven't changed
               | anything aside from giving these ideas more distribution.
               | It's not like /r/88 or whatever would be interacting with
               | the rest of Reddit thus helping their members
               | deradicalize.
        
               | notchFilter54 wrote:
               | I appreciate your honesty in believing the public square
               | is a center of inspiration for mass shootings.
               | 
                | I believe quite the opposite. It has been a place for the
                | public to plan self-defense: to organize themselves
                | against natural disasters, hostile forces, wildfires, and
                | anyone who seeks to do them harm. It is a
               | place for the public to engage in the marketplace of
               | ideas and inspirations, which ultimately leads to the
               | saving of lives, prosperity, security, and bonding of the
               | populace. Harmful ideas can be shamed and those espousing
               | bad ideas have a chance of learning the holes in their
                | ideas. The mass shooter espousing violent ideas in the
                | public square is as likely to have put his neighbors on
                | alert for any evidence of crime as he is to convince the
                | general populace of his nutjob ideas.
               | 
                | I don't buy your hypothesis that Hitler came to power
                | because of free speech, and quite frankly it is laughable
                | to think banning Hitler from Reddit (were it to exist in
                | his day) would have had any effect whatsoever. You seem
                | quite ignorant of the factors precipitating Nazism,
                | including the economic situation of Germany at that time.
                | It's also worth noting that Hitler was quick to stifle
                | certain speech that went against his ideas, meaning he
                | found free speech at odds with, or even dangerous to,
                | Nazism.
               | 
               | ---------
               | 
               | >How did Osama Bin Laden recruit extremists who were
               | willing to put a bomb into the WTC basement? Propaganda,
               | speech.
               | 
               | Bin Laden attempted to blow up the WTC basement with
                | bombs, not free speech. Bin Laden lived in Muslim nations
                | where speech was more restricted than on Reddit.
               | 
               | >Moreover you have to recognize that these isolated echo
               | chambers would naturally self-segregate on Reddit if
                | given free rein, and so in practice you haven't changed
               | anything aside from giving these ideas more distribution.
               | It's not like /r/88 or whatever would be interacting with
               | the rest of Reddit thus helping their members
               | deradicalize.
               | 
                | Some may, some may not. I've stopped using reddit because
                | I was banned simply for saying things like I didn't
                | believe forcefully shutting down a restaurant is an
                | appropriate way to deal with coronavirus. Now maybe that
               | is a very wrong and bad idea, but I'm willing to debate
               | with others on it and learn their perspectives. Instead
               | these communities said fuck you, you're banned, and now
               | you have to go to some echo-chamber where everyone agrees
               | with it. I'm not interested in an echo chamber, I'm
               | interested in engaging with others so my bad ideas can be
               | brought to light and shown to be bad, or my good ideas
               | can be integrated. Your argument sounds more like one
               | against having subreddits.
        
               | fighterpilot wrote:
               | Hitler convinced almost half the country to vote for him
               | because of speech that drummed up resentments stemming
               | from the Versailles Treaty and the depression, channeling
               | and anthropomorphizing those resentments towards Jews,
                | the Lügenpresse, the military establishment, and so on.
               | So you've missed my point, which is that _town square
               | offline speech_ can directly cause pathological outcomes
               | when it is weaponized by bad faith actors.
               | 
               | The belief that sunlight is the best disinfectant is
               | nothing more than empty sloganeering and it flies in the
               | face of everything we know about social contagion and the
               | willingness of humans to be led astray by tribal hatred.
               | 
               | Town square offline speech didn't lead specifically to
               | mass shootings historically only because this particular
               | medium of terrorism is a modern fashion trend, so it
               | follows that it's a phenomenon that's going to be
               | motivated online more than offline in the modern context.
        
               | notchFilter54 wrote:
               | And your argument is that if the venues hosting Hitler's
               | speeches had Reddit's moderation policies then Hitler
               | would not have been elected?
        
               | fighterpilot wrote:
               | You're trying to draw analogies between modern technology
               | and the old town square. You should stop doing that
               | because instant distribution to a billion people isn't
               | the same thing as a speech to a thousand.
               | 
               | I provided examples of speech in the old town square
               | leading to pathological outcomes, but we are in a very
               | different regime now and analogizing too much isn't
               | helpful.
        
               | notchFilter54 wrote:
               | So who should decide what moderation policies we have for
               | the public? The general populace, who as you say would
               | elect literally Hitler, or the government itself of which
               | Hitler was once a part and used these very moderation
                | mechanisms to suppress the Jews? The tyranny of a
                | minority of special moderators, like the censor committee
                | a nominally communist state might have? We allow Nazi
                | speech to exist precisely because we don't want the
                | government or the tyranny of the majority or minority
                | choosing what political speech is allowed, such as
                | outlawing speech that doesn't promote Nazism.
               | 
               | >You're trying to draw analogies between modern
               | technology and the old town square.
               | 
               | No I'm trying to find out how you want to apply
               | moderation strategies to "reduce the likelihood" (my
               | apologies if I misquoted your deleted comment) of
               | democratic election of those who some censors decide have
               | the wrong political views or speech.
               | 
               | >You should stop doing that because instant distribution
               | to a billion people isn't the same thing as a speech to a
               | thousand.
               | 
               | Are you also one of those that thinks the first amendment
               | doesn't apply to the internet because the founders never
               | imagined something that distributes so much faster than
               | the printing press could exist? I know this is a straw
               | man but I can't help but think this is where this is
               | leading.
               | 
               | >And your argument is that if the venues hosting Hitler's
               | speeches had Reddit's moderation policies then Hitler
               | would not have been elected?
               | 
                | The fact that you didn't answer this question (well, you
                | did, but you deleted it) really is a damning answer in
                | itself.
        
               | fighterpilot wrote:
               | > So who should decide what moderation policies we have
               | for the public?
               | 
                | There are three possibilities:
               | 
               | (1) No moderation at all, beyond what's illegal.
               | 
               | (2) Private voluntary self-regulation.
               | 
               | (3) Government censorship.
               | 
               | In my opinion, (2) is the lesser evil, which isn't to say
               | that it doesn't have its own pitfalls. (1) is infeasible
               | due to the 8chan experience, and our understanding of
               | social contagion and human tribalism. (3) has a much
               | bigger slippery slope risk.
               | 
               | > The fact that you didn't answer this question
               | 
               | I deleted my answer because these analogies are too
               | tenuous. You're trying to compare modern social media
               | with how information spread 90 years ago. How can I map
               | "Reddit's moderation policies" onto 1920s beer halls and
               | Der Sturmer and newspapers? You can't do it. We're in a
               | new regime and we need to reason about this new regime
               | from first principles.
        
               | notchFilter54 wrote:
                | We're in agreement, although I might add that (2) is
                | essentially the same as the censorship policy in the
                | Weimar Republic under which Hitler was elected, where
                | public censorship was nominally and constitutionally
                | illegal [1] (except in narrow circumstances, such as
                | anti-Semitic expression) and any censorship was
                | essentially relegated to private and/or voluntary
                | regulation.
               | 
               | > How can I map "Reddit's moderation policies" onto 1920s
               | beer halls and Der Sturmer and newspapers?
               | 
               | The same way the first amendment is applied to both beer
               | halls and the internet. There's not a single rule in
               | Reddit's content policy that cannot be applied to a beer
               | hall [0]. If you fail to find a way to apply these rules
               | you're either not putting in any effort or you're a lot
               | dumber than you sound (methinks the former).
               | 
               | Given that what you advocate for is virtually identical
               | to that under the Weimar Republic, I assert your chosen
                | policies would have had little to no effect on the
                | election of Hitler.
               | 
               | [0] https://www.redditinc.com/policies/content-policy
               | 
               | [1] Ritzheimer, Kara L (2016). 'Trash,' Censorship, and
               | National Identity in Early Twentieth-Century Germany.
               | Cambridge University Press.
        
               | [deleted]
        
               | PerkinWarwick wrote:
               | >Reddit now will become the place that inspires mass
               | shooters instead of 8chan
               | 
               | I can guarantee that there are plenty of evil doings on
               | Reddit and Facebook.
               | 
               | One argument, and a more honest one, that people can make
               | is that (a) social media is toxic and (b) it should be
               | made illegal generally. Bingo bango, no mass shootings I
               | guess.
        
               | [deleted]
        
         | mlindner wrote:
         | > Banning hate communities does work though
         | 
         | According to a single study. In general I don't think it
         | actually does. Making it harder to find is sufficient, banning
         | it outright just proves their point and pulls additional
         | moderates to their cause.
        
         | quantumBerry wrote:
         | If all "legitimate" platforms ban hate speech, then users
         | wanting to engage in hate speech will all go to some platform
         | that radically allows all speech with disproportionately this
         | undesirable speech. They will intermingle disproportionately
         | with those spreading sexual abuse images, drugs, insurgent
         | propaganda and instructional material, and other undesirable
         | material. Facebook likely makes the problem worse by forcing
         | these "hate speech" and "disinformation users" to be completely
         | surrounded by people with repulsive content, instead of having
         | their repulsive content critiqued and shamed by other users.
         | 
          | Having people with bad, hateful ideas out in the open is, I
          | would argue, preferable to concentrating all these bad thoughts
          | together with people who will reinforce that they are normal.
        
         | N00bN00b wrote:
         | I'm being censored on Reddit myself.
         | 
          | I'm not really sure why exactly. I suspect it has to do with
          | having an NSFW tag on my account (that I didn't put there) that
          | I'm afraid to turn off, because of the warning next to that
          | button: it doesn't explain what exactly counts as "swearing",
          | and if I turn it off and write "fuck" I might get banned.
         | 
         | My account provides a lot of help to other people, it's highly
         | valued by the community, I have loads of karma, I've paid for
         | quite a bit of server time by the rewards I've received.
         | 
         | But you can't search on my username, it won't show up (not even
         | if you have NSFW visibility turned on).
         | 
          | I'm not about politics, or covid or _any_ hate stuff. I'm just
         | there to try and help people with depression, anxiety and other
         | issues. And I'm being censored for some reason.
         | 
          | So whatever they're doing, it's guaranteed to overreach, it's
          | counterproductive, and who knows how much damage it does to
          | society as a whole.
         | 
         | It's just that no one knows the severity of the damage the
         | censorship does.
        
           | colpabar wrote:
           | > I'm just there to try and help people with depression,
           | anxiety and other issues.
           | 
            | Then it's most likely because those topics aren't "advertiser
           | friendly." Youtube did something similar during what was
           | called the "adpocalypse." Advertisers don't want their
           | products to be associated with the things you talk about, and
           | reddit cares more about selling ads than helping you help
           | people.
        
           | ModernMech wrote:
           | Reddit's search is incredibly (purposefully?) broken. It
           | won't give you nsfw results for _anything_. You have to go to
           | old.Reddit.com and use that search system. It has a checkbox
           | for "include nsfw results" that is missing from the new ui.
           | Check that and see if it helps you find yourself.
        
             | N00bN00b wrote:
             | I don't really need to find myself, you know. I know where
             | I am. The problem is more that others can't find me and
             | don't even know that I'm missing. Maybe you're right about
             | the reason, but even that's irrelevant.
             | 
             | It's scary. I happen to know this was done to me. How many
             | times have others made _you_ disappear without you knowing
             | you were scrubbed without being given any notice?
             | 
             | This isn't my first account on hackernews (not my second
             | one either, though with one exception they are all in good
             | standing).
             | 
             | But my very first account, I was happily participating here
             | for years, until someone kindly told me I had been
              | shadowbanned, and I don't know how much of my voice was
              | censored exactly, and I don't know why.
             | 
             | So the issue isn't reddit. And apparently it's not really
             | about me. The problem is exactly what this article is
             | about, private censorship. Web3 is supposed to address
              | issues like this (though I don't know how much of that is
              | handwaving and make-believe). As soon as I can, I'm going
             | to move there.
             | 
             | I just can't trust these corporate entities, they don't
             | operate in good faith, aren't transparent and avoid
             | accountability for their actions. The utopia that was
             | promised to me has failed and is in full decay, the signs
             | are everywhere. It's time to move on.
             | 
             | It's sad that a "normal user" like me ended up with those
             | beliefs.
             | 
             | I just wanted to be left alone and allowed to voice my
             | opinions, I don't want to have to deal with censorship, I
             | don't want to have to tell others how I've been impacted by
             | it, to make them aware of what's happening behind their
             | backs. Instead, here I am having a discussion like this
             | with others.
             | 
             | Am I even allowed to say this? Will this be wiped as well?
             | I don't know. I wish this was all a joke. What kind of
             | dystopian future is this? It's ridiculous beyond belief.
        
           | eigengrau5150 wrote:
           | A friend of mine got permabanned from Reddit a couple of days
           | ago for calling _himself_ a faggot. Apparently reclaiming a
           | slur is hate speech over there.
        
           | rjbwork wrote:
           | I don't think I've ever seen an _account_ with an NSFW tag
           | /flair. Posts? Yeah. Subreddits? Yup. Possibly a flair for
           | your user on a particular subreddit added by a mod for some
           | reason? Sure. But not accounts. Can you link to this tag?
        
             | N00bN00b wrote:
             | Oh yeah. Accounts have NSFW tags these days. Also impacts
             | chat functionality. You'll get warnings there. They call it
             | "profile" but your profile is 100% of your account
             | activity.
             | 
             | https://www.reddit.com/r/help/comments/pfpw4y/any_way_to_re
             | m...
             | 
             | This link has pictures in it, showing you the NSFW tag:
             | 
             | https://www.reddit.com/r/techsupport/comments/8qhay9/why_is
             | _...
        
         | Manuel_D wrote:
         | To be clear, the study found that platforms banning topics
         | succeeds in removing those topics _from the platform_ - not
         | necessarily from society as a whole. The study did not conclude
         | that banning hateful communities off of reddit actually made
         | those users less hateful or curbed the spread of hateful
         | content online. If each of those banned users subsequently
         | posted twice as much hateful content on a different platform,
         | that still comports with the conclusions of that study.
        
         | [deleted]
        
         | leetcrew wrote:
         | there's at least one major reason to be skeptical of that
         | result, and they mention it in their limitations section.
         | 
         | what they did was create a lexicon of hate speech terms from
         | two large subs that were banned. they then counted the
         | frequency of those terms in other subs after the ban. they
         | found that usage of those terms dropped substantially, and
         | concluded that the bans were effective at reducing hate speech.
         | 
         | if you're familiar with the dynamics of these sorts of subs,
         | the problem with this approach should be fairly obvious. these
         | subs tend to develop their own set of specific terms/memes
         | (hate-related or otherwise). it may be the case that the bans
         | were effective at reducing hate speech across the whole site.
         | but it's also possible that the same people are still posting
         | the same stuff coded differently. this study is far from the
         | final word on the matter.
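          | 
          | to make that measurement concrete, here's a rough sketch of
          | the lexicon-frequency approach (not the paper's actual code;
          | the term list and comments below are made-up placeholders):
          | 
          |     # count how often lexicon terms appear, per token, in a
          |     # batch of comments (hypothetical data throughout)
          |     from collections import Counter
          |     import re
          | 
          |     lexicon = {"term_a", "term_b"}  # stand-in hate lexicon
          | 
          |     def lexicon_rate(comments):
          |         tokens = [t for c in comments
          |                   for t in re.findall(r"[\w']+", c.lower())]
          |         if not tokens:
          |             return 0.0
          |         counts = Counter(tokens)
          |         return sum(counts[t] for t in lexicon) / len(tokens)
          | 
          |     # the same users' comments before and after the ban
          |     before = ["example comment with term_a", "another comment"]
          |     after = ["example comment", "yet another comment"]
          |     print(lexicon_rate(before), lexicon_rate(after))
          | 
          | the whole measurement hinges on that fixed term set, which is
          | why recoded language slips through it.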
        
         | detcader wrote:
          | Neither of your assertions is related to what is being
          | discussed. Nuclear bombs "work" and bad people would do
          | horrible things with them, but that doesn't say anything about
          | whether we _should_ use them.
        
         | hartator wrote:
         | > http://comp.social.gatech.edu/papers/cscw18-chand-hate.pdf
         | 
         | This paper is trash. On defining hate speech, "we focus on
         | textual content that is distinctively characteristic of these
         | forums", so, yes, banning specific subreddits resulted in less
         | of that content in the overall Reddit. Did it make Reddit a
          | more friendly place? Probably not, as tensions have never been
         | higher.
        
           | Pils wrote:
            | > _Did it make Reddit a more friendly place? Probably not, as
           | tensions have never been higher._
           | 
           | Did allowing hate speech make 8chan/Voat friendlier? There
           | are plenty of unfriendly communities that lack overt
           | racism/fatphobia, but I can't think of any friendly ones that
           | do.
        
             | NaturalPhallacy wrote:
             | Yes. People weren't trying to censor each other so they
             | were less pissed off.
        
         | Darmody wrote:
          | Then r/politics should be banned, but instead it's one of the
          | biggest subreddits.
        
         | naasking wrote:
          | Sure it worked at reducing hate speech _on reddit_, but is
         | that the right objective? It's not implausible to think that
         | the resentment from such local measures could cause hate to
         | increase in a more global sense. Kind of like entropy in
         | thermodynamics: local work can reduce local entropy, but at the
         | expense of causing a global increase in entropy.
         | 
         | If the _real_ objective should be a global decrease in hate,
          | then maybe local suppression/exile might not be the right
         | mechanism.
        
           | gremloni wrote:
           | The hate is in part generated by a continuous bubbled
           | feedback loop. Cutting out the source(s) usually doesn't lead
            | to a redirection but rather has a chilling effect.
        
             | naasking wrote:
             | A plausible hypothesis, but needs data.
        
               | gremloni wrote:
               | Pretty much how it played out with r/fatpeoplehate.
               | Anecdotal yes, but there are quite a few more examples on
               | Reddit alone.
        
           | chmsky00 wrote:
           | The goal is to reduce network effects.
           | 
            | As poetic as free speech is, the 1700s were a different time.
            | Hard to SWAT a rando a town away, let alone a country away.
            | Now a nation-state can instigate localized terrorism.
            | 
            | Thomas Jefferson understood that laws and constitutions must
            | change in step with human discovery.
            | 
            | Scalia wrote that laws are not open-ended, and not up for
            | romantic public interpretation.
            | 
            | To paraphrase both: many Founders wrote of the need for free
            | speech in official political proceedings, never considering
            | the privilege extending to the public. In public, people were
            | welcome to get a beating if they preferred to violate local
            | order.
            | 
            | Humans are biological animals first. Gentle by training.
        
           | simonh wrote:
           | I suppose to the extent these channels of communication
           | recruit and radicalise, if they are shut down they will not
           | be able to recruit and radicalise. The act of shutting them
           | down might infuriate regular users of these channels, but
           | they're already radicalised anyway.
        
             | burnafter182 wrote:
             | What you end up with _is_ recruitment. If the user remains
              | on Reddit or Twitter they're exposed to the gamut of human
             | thought. However extreme they may be, they're still
             | attached, there may exist some sort of analogue for a
             | cleavage furrow, but nonetheless the cell remains. It's
             | only at the point where you've so alienated and shorn their
             | attachment to the whole that they become a fully
             | independent entity, and that they become truly radicalized.
             | And having observed this, I can say with sincerity that
             | they move deeper into the domain of extremity.
        
               | gremloni wrote:
                | The point at which someone is "recruited" is subjective.
                | Either way, cutting out new recruits is a great solution.
        
             | naasking wrote:
             | There is little evidence that radicalization happens online
             | [1]. It seems to require in-person contact to really take
             | root.
             | 
             | [1] https://www.rand.org/randeurope/research/projects/inter
             | net-a...
        
               | JasonCannon wrote:
               | I promise you from personal experience, radicalization
               | does happen online too.
        
               | xadhominemx wrote:
               | This "study" has been proven to be obviously and
               | demonstrably incorrect in the years since it was
               | published. The authors should retract.
        
               | fallingknife wrote:
               | source?
        
               | xadhominemx wrote:
               | Have you read the conclusions of the study? Do they
               | comport with events of the past 5 years? Not at all
        
               | thereddaikon wrote:
                | If you are going to claim a study is bunk, then link
                | another study proving it. Empty statements like that
                | prove nothing. That's not science.
        
               | xadhominemx wrote:
               | This study was not really science. They interviewed a few
               | people and then applied their own qualitative analysis to
               | conclude that people cannot be self radicalized fully
               | online. Since they published the study, there have been
               | several major terrorist attacks perpetrated by
               | individuals fully radicalized online.
        
               | fallingknife wrote:
               | I see. Your source is moral panicking in the news and
                | journalists' psychic ability to determine that dangerous
               | radicalism is on the rise without ever bothering with any
               | sort of data collection. Who could argue with that?
        
               | Cd00d wrote:
               | I'd like to recommend this podcast about online
               | radicalization, produced by the NYTimes:
               | https://www.nytimes.com/column/rabbit-hole
        
               | Pils wrote:
               | That research was published by a libertarian think tank
               | in 2013. Since then, there have been numerous examples of
               | lone wolf terrorist attacks where the perpetrators
               | appeared to have no offline contacts with extremist
               | groups. See: New Zealand shooting, Pittsburgh shooting,
               | etc.
        
               | fallingknife wrote:
               | There were also lone wolf mass shootings before the
               | internet, so what does this prove?
        
               | blitzar wrote:
               | That radicalisation does not exclusively happen face to
                | face and that it can be conducted online, by newsletter,
               | by book, by carrier pigeon.
        
             | inglor_cz wrote:
             | If ISIS can recruit in person by talking to vulnerable
             | people in the right mosques, surely other extremists have
             | backup channels as well.
        
               | gremloni wrote:
                | The more someone has to rely on backup channels, and then
                | on backups of the backup channel, the less likely they
                | are to enroll new members.
        
               | simonh wrote:
               | But surely fewer channels means fewer recruits?
               | Ultimately it's about what kind of content these
               | platforms want to publish though.
        
               | inglor_cz wrote:
               | It might also mean "higher quality recruits".
        
               | gremloni wrote:
               | And that's okay. Fewer "higher quality" recruits are
               | easier to go after individually.
        
           | kongin wrote:
           | I can say that the local subreddit is one of the most heavily
           | policed with the result that during the covid pandemic a
           | large fraction of the casual posters were banned because they
            | couldn't keep track of which new rule was introduced on which
            | day.
           | 
           | A lot of people moved over to telegram channels because you
           | could actually ask questions without getting banned for
           | concern trolling, misinformation or incitement - e.g. asking
            | where protests were happening, which is what finally got me
            | banned. Ironically, I was asking where they were happening so
           | I could avoid them.
           | 
           | The result of that policing is that I now have a telegram
           | account and regularly scan a dozen right wing channels so I
           | know if I can buy groceries without getting tear gassed.
           | 
           | If this is what winning looks like for the left we don't need
           | help in losing.
        
         | londgine wrote:
         | > Banning hate communities does work though ...
         | 
         | The sentence is incomplete. Banning X communities does work to
          | achieve the goal of people not talking about X. I don't think
          | that your linked study is really necessary; the Chinese
          | cultural revolution worked really well (at achieving the goal
          | of "preserving Chinese communism") [0]. Imagine if 30 years ago
          | large digital monopolies had banned what was considered
          | unmentionable back then. I doubt gay marriage would have been
          | legalised in America. All of the progress we have made toward
          | legalising marijuana would have been cut off by companies
          | wanting to prevent people from advocating illegal drug usage.
         | 
         | [0] https://en.wikipedia.org/wiki/Cultural_Revolution
        
           | monocasa wrote:
           | > the Chinese cultural revolution worked really well
           | 
            | Given that the lineage in power now (the Dengists) was
            | imprisoned under Mao during the cultural revolution, and that
            | only after Mao's death was there a coup that arrested the
            | rest of the leaders of the cultural revolution and let the
            | Dengists out of jail, I'm not sure that's the case.
        
           | da_chicken wrote:
           | Chinese censorship is backed by the threat of arbitrary
           | imprisonment or violence. In that case, it's not really the
           | banning of the topic that's working, it's policing for
           | compliance and de facto criminalization of defiance.
        
           | majormajor wrote:
           | Homosexuality in general _was_ effectively banned by dominant
           | platforms in the US for quite some time! It was a BIG DEAL to
           | people when gay characters started becoming more common in
           | mass media.
           | 
           | But note the order it happened. The privately-set standards
           | moved with the times much faster than the government ones
           | did!
           | 
           | Writers/tv execs/etc heard both the pro-equality arguments as
           | well as the anti-homosexuality arguments and made their own
           | choices as they were persuaded to. Many states, on the other
           | hand, never legalized gay marriage before the court decision
           | overruled their laws.
           | 
           | So that seems to show that we should empower private parties
           | to have control over what their platform shows, over either
           | the government or just the loudest mobs (there were MANY
           | protests/boycott threats/etc from religious groups over
           | this). The market gives this the advantage over the
           | government here: the private publisher can test what sells,
           | and over time is going to be increasingly forced to move with
           | societal changes, while the government is much more likely to
            | be captive to small-but-loud constituencies (especially in a
           | gerrymandered world).
        
           | Clubber wrote:
           | >Banning X communities does work to achieve the goal of
           | people not talking about X.
           | 
           | Honestly I'm not sure that is even accurate, I would imagine
           | it does the opposite. People are drawn to what's not allowed,
           | it's even one of the morals of the Adam and Eve story. Pretty
           | old but still relevant idea of human nature.
           | 
           | Here's an interesting story about Goldberger who defended the
           | Neo Nazis in Skokie.
           | 
           | https://www.aclu.org/issues/free-speech/rights-
           | protesters/sk...
        
             | inglor_cz wrote:
             | From a former Communist country: whatever was banned (jokes
             | about the Party or the Soviet Union, various conspiracies
             | or whatever correct-but-undesirable information out there,
             | such as the Chernobyl accident in the first days), spread
             | like wildfire by "whispering channels".
             | 
             | People are really drawn to forbidden fruit.
        
               | kongin wrote:
               | You need to understand that the people who are saying we
               | must ban misinformation are the party apparatchiks,
               | trying to argue in good faith with them is pointless. The
               | only reason to engage is to show the silent majority that
               | they aren't crazy for disagreeing with those in power.
        
         | majormajor wrote:
         | Yes, reforming users who already want to engage in hate speech
          | is not the goal; isolation is. Look at pre-internet times:
          | certain forms of radicalization were far less common than they
          | are today because they weren't very present in the in-person
          | community (while those that WERE geographically concentrated,
          | like racism in certain American communities, still spread in
          | those places).
         | 
         | When the status quo is bad, "maybe that won't work" is _not_ an
          | effective argument against taking action anyway if you don't
         | have a better idea. If we don't take action, we already know
         | we're going to (continue to) have a bad result!
        
         | fallingknife wrote:
         | Also important to note that Reddit doesn't ban hate. They ban
         | unpopular opinions. If you want to see hate, go to r/politics.
        
           | bedhead wrote:
           | Yup. These platforms are all perfectly 100% fine with hatred
           | as long as it's the right kind.
        
             | [deleted]
        
         | R0b0t1 wrote:
         | They're measuring success based on referencing the community
         | they just banned people from, aren't they?
        
         | commandlinefan wrote:
         | > Banning hate communities does work though [0]
         | 
         | That's like saying banning smoking in the park reduces smoking
         | because nobody's smoking in the park any more.
        
           | bearcobra wrote:
            | I mean, location-based smoking bans definitely do reduce
           | smoking rates.
        
           | icyfox wrote:
           | These things do help. Most members of these communities get
           | radicalized by virtue of other content they're already
           | browsing. If you remove the communities and force individuals
           | to leap from Facebook to some no-name forum, they're far less
           | likely to engage.
           | 
           | This phenomenon is mirrored elsewhere as well where small
            | interventions can lead to big impacts: see suicide rates in
            | Britain after carbon monoxide was removed from household gas.
            | Initial friction (or "means reduction" in that research) can
            | drive change.
        
             | idiotsecant wrote:
             | I'm not convinced we should think it's acceptable for you
              | or me to decide who does and does not get exposed to
             | information that will 'radicalize' them because you and I
             | may have wildly different opinions of what constitutes a
             | radical opinion. Humans sometimes develop stupid opinions,
             | and that is OK. They're allowed to do that. They aren't
             | allowed to _act_ on those opinions in a way that
             | constitutes a crime, which is why we go through the trouble
             | of defining crimes in the first place.  'Precrime' thought
             | policing seems like a pretty dangerous road to go down and
             | one that is full of unintended consequences.
        
               | icyfox wrote:
               | I define a radical group to be some collection that seems
               | to be consistently peddling false material for their own
               | ends, with an overly provoking tilt that makes it easy to
                | go viral. Some specifics out of the past year would be
                | vaccine misinformation, prejudice that incites racial
                | violence in Myanmar, or organizing an occupation of the
                | Capitol in the US. Shouldn't we put a stop to these
                | communities if they sway people to start peddling this
                | information themselves?
               | 
               | At least in the US, free speech is most free within
               | public forums. But even then we already define some
               | speech that is too dangerous if it's known to lead to
               | poor outcomes. You can't yell fire in a crowded room, not
               | because yelling fire is inherently a crime, but because
                | it's going to incite people to detrimental ends.
               | 
               | Plus social networks don't have a constitutional
               | obligation to be town squares where free speech can
               | spread unfettered. You have to draw a line somewhere. And
               | as the people who write the algorithms that can amplify
               | or bury content on these networks, I think it _is_ our
               | obligation to at least set parameters on what constitutes
               | good/healthy content interactions on these social
               | platforms and what doesn't. The ML algorithms have to
               | optimize over some loss function.
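                | 
                | To make the loss-function point concrete, here is a toy
                | ranking sketch (purely illustrative; the field names and
                | weights are made up, not any platform's real code):
                | 
                |     # score posts for a feed; the weights encode a human
                |     # judgment about what counts as "healthy" interaction
                |     def rank_score(post, w_engage=1.0, w_flag=5.0):
                |         engagement = post["clicks"] + post["shares"]
                |         penalty = w_flag * post["misinfo"]
                |         return w_engage * engagement - penalty
                | 
                |     posts = [
                |         {"id": 1, "clicks": 50, "shares": 10,
                |          "misinfo": 0.0},
                |         {"id": 2, "clicks": 90, "shares": 40,
                |          "misinfo": 20.0},
                |     ]
                |     feed = sorted(posts, key=rank_score, reverse=True)
                | 
                | Set w_flag to zero and pure engagement wins; set it high
                | and the provocative post gets buried. Either way, someone
                | chose the objective.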
        
               | throwaway946513 wrote:
                | Not that I disagree with you - but "can't yell fire in a
                | crowded room" is slightly misconstrued, as those aren't
                | the original words from the U.S. Supreme Court case. [0]
                | 
                | Additionally, the 'clear and present danger' standard has
                | been modified in the 100 years since that court case. The
                | Supreme Court has since stated: "The government cannot
                | punish inflammatory speech unless that speech is
                | 'directed to inciting or producing imminent lawless
                | action and is likely to incite or produce such action'".
                | The test now turns on whether the action is imminent or
                | merely expected "at some ambiguous future date". [1]
               | 
               | [0] https://en.wikipedia.org/wiki/Shouting_fire_in_a_crow
               | ded_the... [1]
               | https://en.wikipedia.org/wiki/Imminent_lawless_action
        
           | majormajor wrote:
           | A lot fewer people smoke now in the US than several decades
           | ago.
           | 
           | So let's do it. Let's isolate and restrict the hateful, the
           | angry, the irrational EXACTLY like we did smoking.
        
             | idiotsecant wrote:
             | This seems like a decent idea until someone in power
             | decides that you are hateful, angry, or irrational.
        
               | TulliusCicero wrote:
               | You could use the same reasoning like
               | 
               | > Making bad things illegal seems like a decent idea
               | until they make something you like doing illegal
               | 
               | Therefore just have no laws, I guess?
               | 
               | Sometimes an imperfect, abusable system is better than no
               | system. In fact, I'd wager that's usually the case.
        
               | AnimalMuppet wrote:
               | > Therefore just have no laws, I guess?
               | 
               | No. But have a system where some single powerful person
               | can't just make up laws on a whim.
        
               | TulliusCicero wrote:
               | Well, they generally can't: laws are written and passed
               | by a legislature.
               | 
               | You could make this sort of argument for executive
               | orders, though.
        
             | [deleted]
        
           | babypuncher wrote:
           | Maybe, but it does make the park a more pleasant space for
           | everyone else to spend time in.
        
       | nyx_land wrote:
       | The irony of private platforms censoring hate groups (for their
       | own PR, I might add; let's not forget that YouTube's algorithm
       | famously favors alt-right content, because weirdly enough
       | algorithms that optimize for "engagement" tend to end up favoring
       | lowest-common-denominator populist reactionary politics) is that
       | it (1) legitimizes the claims of groups that otherwise might sound
       | like delusional paranoid conspiracy theorists, and (2) forces
       | these groups to adopt decentralized alternatives that make them
       | even harder to do anything about. Tech companies engaging in
       | performative content moderation is really just creating selection
       | pressure that will ultimately produce more resilient, technically
       | competent hate groups that will also have an increasingly
       | legitimate case to make for being persecuted by powerful
       | organizations.
        
       | cratermoon wrote:
       | Harmful speech and misinformation should not have a platform.
        
         | gorwell wrote:
         | Define `harmful` and `misinformation`.
        
           | cratermoon wrote:
           | https://news.ycombinator.com/item?id=28698642
        
         | chroem- wrote:
         | It's all fun and games until the Ministry of Truth refers your
         | case to the Ministry of Love.
        
           | cratermoon wrote:
           | - Performative Hate Speech Acts. Perlocutionary and
           | Illocutionary Understandings in International Human Rights
           | Law: https://www.researchgate.net/publication/334199177_PERFO
           | RMAT...
           | 
           | - Oppressive Speech: https://www.tandfonline.com/doi/full/10.
           | 1080/000484008023703...
           | 
           | - Why We Should Ban Oppressive Speech Acts:
           | https://publicseminar.org/essays/why-we-should-outlaw-
           | oppres...
        
             | chroem- wrote:
             | The speech in your post is oppressing me right now. Cease
             | and desist your verbal oppression or else I will use state-
             | sponsored violence to end your oppressively free speech.
        
               | cratermoon wrote:
               | Shouting 'fire' in a crowded theater has never been
               | protected speech.
        
               | mardifoufs wrote:
               | In the US? Yes, it is. You realize you are quoting a
               | 1919 Supreme Court case (which used the "fire in a
               | theater" argument to make protesting against the draft
               | illegal)... and which was overturned 50 years ago,
               | right?
        
               | detcader wrote:
               | For a while now it's been understood that the "'fire' in
               | a crowded theater" thing is a misinformed meme
               | 
               | https://www.popehat.com/2015/05/19/how-to-spot-and-
               | critique-...
        
         | swagasaurus-rex wrote:
         | There is a platform. In private, where that sort of talk has
         | always existed.
         | 
         | When it's limited to private, it's obvious to the speaker that
         | holding those opinions is a personal choice, not a publicly
         | acceptable opinion.
        
         | drooby wrote:
         | Who defines what speech is harmful or wrong?
         | 
         | Democracy is a system in which _your_ party loses elections.
         | And when it loses, do you want the winners dictating what you
         | can and can't say?
         | 
         | No one has a monopoly on the truth.
         | 
         | In fact, our greatest scientific discoveries (the closest thing
         | we have to Truth) have been forged by "offensive speech". The
         | ability to offend actually helps minimize misinformation.
        
           | cratermoon wrote:
           | > Who defines what speech is harmful or wrong?
           | 
           | https://news.ycombinator.com/item?id=28698642
        
             | detcader wrote:
             | Let's trust the billionaire execs at Google, Facebook,
             | Amazon and Twitter to listen to the correct academics
             | rather than responding to the incentives of capital. When
             | faced with calls to ban pro-Palestinian rights activism on
             | their platforms, they've never caved before
        
             | rscoots wrote:
             | I notice those lists do not include the exact same
             | materials.
             | 
             | So you've proven the point that there is debate to be had
             | on these matters.
        
       | djoldman wrote:
       | Interesting document linked to from the article:
       | 
       | https://manilaprinciples.org/
        
       | supernintendo wrote:
       | Some of the comments in this thread seem intentionally bad-faith
       | or ignorant of how much hate and abuse actually exists on these
       | platforms. The Internet fucking sucks. I stopped using social
       | media because I'm trans and I don't feel like there's a place for
       | me online anymore. No matter where I go, including large
       | platforms like Reddit and Twitter, I'm inevitably subjected to
       | someone expressing their grievances about trans people or the
       | LGBTQ community. There's a part of me that wants to reply and
       | give my perspective, but it's like I can't even have a voice
       | online without the fear of people belittling and harassing me,
       | sending me abusive messages, trying to doxx me, telling me to
       | kill myself or scouring through my profile to try to find
       | whatever they can add to their "cringe compilation" or Kiwifarms
       | thread about how degenerate and disgusting trans people are. My
       | mental health is more important than participating in the
       | shitfest that is online discourse so I just avoid it. I post on
       | Hacker News and a few other places where people are generally
       | respectful, but other than that I've given up on having
       | conversations with strangers online.
       | 
       | I'm an artist and a software dev, I have a lot that I want to
       | share with the world but I don't think I'll ever get the chance
       | to. This world is cruel and these online platforms and social
       | media algorithms amplify that to the point where it feels like
       | the only way to win the game is to not play. Personally, I don't
       | feel one way or the other about online censorship at this point.
       | I think social media has ironically ushered in a culture of anti-
       | social behavior and maybe it's time to move on to something else.
        
       | duxup wrote:
       | I am wary of the situation developing.
       | 
       | But I'm also wary of this idea that "well you just give the
       | people tools to sort it out".
       | 
       | Mostly because:
       | 
       | 1. It seems as unrealistic as expecting people to peruse their
       | own source code... That's a huge pain in the ass for most folks,
       | and they're just as likely to pick a bad filter or service to do
       | it for them anyway.
       | 
       | 2. Did that change anything about the situation where GoDaddy
       | and Google refused to manage the domain registration for the
       | Daily Stormer?
       | 
       | I don't think it did....
       | 
       | Aren't we back at the start with these suggestions?
        
       | tytso wrote:
       | It's not so simple as whether or not a platform allows content of
       | a certain type from certain authors to be published on their
       | platform. It's also about whether the platform is pushing that
       | content to others, using automated tools which have been tuned to
       | improve "engagement". That's some of the Facebook research which
       | was suppressed until the Wall Street Journal published the leaks
       | from that research. Facebook apparently knew that changes they
       | made were allowing posts that made people more angry to get
       | surfaced more, because angry people were more likely to comment
       | on the posts, thus improving "engagement". No matter if it caused
       | disinformation to be spread, or if it made people more angry, or
       | negatively impacted the mental health of teenagers. It improved
       | "enagement" and thus $$$ to Facebook shareholders.
       | 
       | This is also why there may be a problem with the truism "the best
       | way to counter bad speech is by allowing more speech". Well, what
       | if the engagement algorithms cause the bad speech to get
       | amplified 1000x more than the speech which has objectively
       | verifiable truth claims? Free speech assumes that truth and lies
       | would be treated more or less equally --- but that's not
       | necessarily true on modern platforms.
       | 
       | So it's not only a question of whether people have the "right" to
       | publish whatever they want on a platform. Sure, you can stand on
       | a public street corner and rant and rave about whatever you want,
       | including "stop the steal", or "the world is about to end". But
       | you don't have the right to do that with an amplifier which
       | causes your speech to blare out at 100 decibels. Similarly,
       | platforms might want to _not_ amplify certain pieces of content
       | that are killing people by spreading misinformation, or
       | destroying democracy, or encouraging genocide. And that might
       | very well be the best thing platforms can do.
       | 
       | But now we have the problem that content can be shared across
       | platforms. So even if one platform keeps information
       | about vaccines causing swollen testicles from showing up on
       | millions and millions of News Feeds --- what if that same
       | information, posted on one platform, is shared on _another_
       | platform which is much less scrupulous?
       | 
       | So for example, suppose platform Y decided not to amplify videos
       | that they decided were scientifically incorrect, because they
       | didn't want to have the moral stain of aiding and abetting people
       | in killing themselves by not being vaccinated, or in not allowing
       | their children to be vaccinated. But another platform, platform
       | F, which had done the research indicating this would happen, but
       | actively decided that $$$ was more important than silly things
       | like ethics or not destroying democracy, might promote that
       | content by linking to the videos posted on Platform Y. Maybe the
       | best thing Platform Y could do would be to remove those videos
       | from their platform, since even though _they_ were no longer
       | amplifying that information, it was being amplified by another
       | platform?
        
       | BitwiseFool wrote:
       | 2020-2021 has shown me that _most_ people happen to be  'fair
       | weather fans' of civil rights. Everyone claims to love free
       | speech but when suppression of speech opposing their own
       | worldview comes into play, they suddenly dither and fall back on
       | other rationalizations to justify the censorship.
       | 
       | Well, it's not free speech being violated when it's 'hate
       | speech'. Well, it's not free speech being violated when it's
       | 'misinformation'. Well, it's not free speech being violated when
       | it's a private company doing the censorship. Et cetera. I'm sure
       | you've seen your own examples of this.
       | 
       | It's pretty disheartening, but enlightening nonetheless. I have a
       | much better understanding of historical moral panics and cessions
       | of freedoms. Whereas I used to wonder how some societies ever
       | gave in to such pressures, I now realize it's not that hard to
       | persuade the average citizen into accepting such things.
        
         | frellus wrote:
         | 100% of everything you said I agree with. We're in a downright
         | recession on freedoms.
         | 
         | "I now realize it's not that hard to persuade the average
         | citizen into accepting such things."
         | 
         | Yes, and worse, I feel it's not even that people are all too
         | willing to go along with these things from on high --- they
         | want it, they propagate it. I'm starting to see the same
         | behavior and mentality I imagine was common in East Germany,
         | where citizens policed other citizens. It's not a good
         | direction we're headed in at all.
        
         | BongoMcCat wrote:
         | As a European, I mostly just find it interesting that Americans
         | have such an extremely black-and-white view of free speech.
        
           | jaywalk wrote:
           | As an American, I find it interesting that Europeans seem so
           | eager to have government restrict speech.
        
             | inglor_cz wrote:
             | As a European who likes the American First Amendment,
             | I'd say people everywhere generally gravitate to the
             | status quo.
             | 
             | But youngish generations across the pond seem to favor
             | government restrictions on speech in much higher numbers
             | than before:
             | 
             | https://www.pewresearch.org/fact-tank/2015/11/20/40-of-
             | mille...
             | 
             | I am afraid that the U.S. might one day flip too.
        
         | exporectomy wrote:
         | Yep. Every Western country I know of except America has only a
         | token gesture of human rights, and they're only rights that
         | happened to match the culture at the time, with no longer-term
         | vision at all.
         | 
         | New Zealand, for example, has its Bill of Rights Act, which
         | says no forced medical procedures. Somebody was recently fired
         | for refusing a Covid vaccine and went to court arguing that
         | right. The judge said: yeah, there is that right, but the
         | government can also revoke it whenever it likes, for any reason
         | it wants, and it has, so no luck.
         | 
         | Discrimination based on race or sex? Certainly not! Oh, except
         | for hiring domestic workers, selecting flatmates, decency,
         | safety, sports, etc, etc. In other words, all the places where
         | people were already doing discrimination.
         | 
         | The UN's ambitiously named "Universal Declaration of Human
         | Rights" has 28 rights for people, and a 29th right of
         | governments to deny any of those rights for reasons of
         | morality, public order, or the general welfare of a democratic
         | society.
         | In other words, any reason whatsoever.
         | 
         | Remember freedom of internal travel? China used to be a human
         | rights violator for restricting that. Now every country and its
         | dog is doing the same. But this time it's "us" not "them", so
         | it's all OK.
        
         | fighterpilot wrote:
         | Should YouTube censor ISIS propaganda or not?
         | 
         | If yes, how is this not a violation of the principles you've
         | laid out?
        
         | farias0 wrote:
         | Although I don't fundamentally disagree with your point, I
         | don't think you're framing the issue fairly. It's not a matter
         | of most people being enemies of free speech, it's just that
         | this is an inherently difficult problem. Everyone draws the
         | line differently on what's acceptable and what isn't, and every
         | platform is trying to foster a different kind of community.
        
           | frellus wrote:
           | The moment that any large number of people start pushing the
           | point that ideas or speech they disagree with is actual
           | violence, we're all a bit doomed.
        
           | foobarian wrote:
           | I wonder if we would be better served with an NPR-like
           | publicly funded platform for video hosting that puts a lot
           | more resources into content moderation. The private platforms
           | get away with the bare minimum by throwing black-box AI at
           | the problem, which leads to issues like the chilling effects
           | of the anti-vax censorship, etc. There should be an easily
           | reached human in the loop, with transparent decisions and
           | levels of appeal, even though that is much more expensive.
           | Kinda like the court system with its jury-of-your-peers
           | litmus test.
        
             | throwaway894345 wrote:
             | Sadly, NPR is a bad example because they are _hardly_
             | publicly funded and their content is _pretty_ biased (I'm
             | a moderate liberal and NPR definitely feels like it's left
             | of me, even if they're typically _more civil_ than other
             | media outlets).
        
               | LudvigVanHassen wrote:
               | Agreed. NPR is VERY left-biased and it's damn evident.
        
             | godshatter wrote:
             | If the state's going to host it, then their bar should be
             | whether it's legal or not. If they want to put scary
             | warnings around certain content or lock some of it behind
             | age restriction, sure, fine. But if they are going to use
             | public funds to host a government run publicly available
             | video sharing platform, they should be very cautious about
             | removing any content that doesn't violate actual laws. Free
             | speech and all that, if anyone still remembers the concept.
        
               | throwaway6734 wrote:
               | Doesn't the state already ban nudity/language on public
               | broadcast, film, and radio?
        
               | retrac wrote:
               | Specifics depend on the country. In the USA, over the air
               | broadcast is restricted in content on the premise that
               | broadcast spectrum is a limited public resource, and that
               | you don't have much choice with what you see when you
               | tune in. That argument gets pretty weak with point-to-
               | point networks with nearly unlimited bandwidth, as I see
               | it. An analogy might be the difference between ads with
               | nudity on billboards (I believe that can be prohibited in
               | the USA?) and ads with nudity in a print magazine going
               | to a subscribership expecting such ads (protected by the
               | 1A, including for mailing through the USPS).
               | 
               | Public libraries are perhaps another source of analogy.
               | My local library system has some of the most vile and
               | objectionable works ever printed on the shelves due to
               | popular demand. Many public libraries in Canada and the
               | USA are quite absolute about that with regards to free
               | expression. For example:
               | https://www.cbc.ca/news/canada/ottawa/porn-library-
               | ottawa-po... "Library patrons allowed to surf porn,
               | Ottawa mom discovers"
        
               | foobarian wrote:
               | Yes, exactly. And deciding legality should be done
               | better than the current automatic moderation, which
               | punishes innocent content without a working appeals
               | process.
        
               | cyberge99 wrote:
               | The USA used to have a Fairness Doctrine. It required
               | political viewpoints to have equal air time.
               | 
               | It made politics much more boring.
               | 
               | As it should be.
        
               | dragonwriter wrote:
               | Er, politics in the period of 1949-1985 was not,
               | generally, "more boring" than 1985 to present. The last
               | few years _maybe_ have achieved the undesirable level of
               | not-boring that was generally the case through most of
               | the 1950s, 1960s, and much of the 1970s, but certainly
               | overall the post-Fairness Doctrine period was more boring
               | than the Fairness Doctrine period. (The Neoliberal
               | Consensus is probably a bigger factor than the FD on
               | that, though; it's pretty hard to attribute much of
               | anything about the overall tenor of politics to the FD.)
        
               | bdamm wrote:
               | Requiring that content be sourced from actual humans in
               | good faith, and identifying people violating terms of
               | service by spamming with puppet bots would be a good
               | start. If the service is being operated by a national
               | government, then you could require posters to prove
               | residence or citizenship. But would anyone really want to
               | use such a service? Could it even be produced or
               | operated efficiently? Once you put strict legal
               | requirements on the operating entity it will get very
               | slow, expensive, and user unfriendly.
        
             | 1cvmask wrote:
             | NPR censored coverage of the Hunter Biden laptop story,
             | claiming it was not newsworthy. All platforms with any
             | moderation will be censorious platforms by definition. You
             | can always tweak the degree of censorship with moderation,
             | though.
        
           | jaywalk wrote:
           | > It's not a matter of most people being enemies of free
           | speech
           | 
           | I disagree. If you believe in censoring opinions and ideas,
           | you do not believe in free speech. Period.
        
             | joshuamorton wrote:
             | What makes something an opinion or an idea versus a threat
             | or an obscenity?
        
               | throwaway894345 wrote:
               | Intent. If your speech is intended to compel someone
               | (compulsion != persuasion), then it's a threat. Of
               | course, accurately assessing intent is difficult because
               | threats are often implied rather than explicit (precisely
               | because the one issuing the threat wants to avoid the
               | consequences of issuing a threat).
        
               | jaywalk wrote:
               | "Someone should kill (some public figure)" - opinion/idea
               | 
               | "I'm going to kill (some public figure) tomorrow at (some
               | event)" - threat
        
               | b3morales wrote:
               | But then cf. "Will no one rid me of this turbulent
               | priest?"[0]
               | 
               | [0]:https://en.wikipedia.org/wiki/Will_no_one_rid_me_of_t
               | his_tur...
        
               | jaywalk wrote:
               | Not sure how that has anything to do with free speech.
        
           | BitwiseFool wrote:
           | I don't think people are enemies, adversaries, or even
           | detractors of free speech, but rather, they won't actually
           | defend the kinds of speech that the ideal of free speech is
           | meant to protect. Especially when the censorship happens to
           | be affecting their partisan opposite. I do, though, recognize
           | the difficulty in allowing some things and not others
           | depending on the forum.
           | 
           | Please excuse my shortcomings in explaining this, but I
           | have to fall back on "I know it when I see it" when it
           | comes to what counts as violations of the principle of free
           | speech vs content moderation. In the current zeitgeist
           | though, I absolutely see this as censorship rather than
           | content moderation. This is because I absolutely sense
           | partisan motivations for content take-downs and topic-wide
           | bans on such large platforms as YouTube, Twitter, Reddit,
           | etc.
        
         | wutbrodo wrote:
         | > 2020-2021 has shown me that most people happen to be 'fair
         | weather fans' of civil rights.
         | 
         | This is always how it's worked. The idea that protecting the
         | rights of your enemies can be salutary relies on too many
         | complex concepts for the majority of people to have any hope of
         | grasping it: burning the commons, collective action, norm
         | evolution, meta-level thinking[1], modeling counterfactual
         | worlds (e.g. a future where your favored ideology is not
         | dominant and is in need of the protections you are currently
         | burning down).
         | 
         | The average person is nowhere near smart enough to be able to
         | put these pieces together into a coherent worldview, let alone
         | one that they find more convincing than "they're the enemy,
         | crush them". The periods where liberalism has been resurgent
         | are not ones where the masses are suddenly enlightened, but
         | ones in which they either have little power or are pacified by
         | unrelated conditions. This is not unlike the conditions in
         | which dictatorships are stable, as the common thread is simply
         | "the masses can't or don't care in detail about the
         | fundamentals of the way they're governed". It's not a
         | coincidence that the global illiberalism surge coincides with
         | the rise of universal connectivity: On top of the social and
         | economic upheaval that it induced, suddenly large amounts of
         | people can coordinate epistemically, through hashtags and
         | reshares, without making their way through distribution
         | chokepoints controlled by elites.
         | 
         | [1] I couldn't think of a concise way to phrase this, but I'm
         | referring to the tendency to claim that a big chunk of your
         | beliefs/preferences are incontrovertible and fundamental tenets
         | of society while others' are simply their beliefs and
         | preferences.
        
           | Kim_Bruning wrote:
           | I'm not sure people can't grasp it. But it certainly can't be
           | taken for granted.
           | 
           | Perhaps we need to continuously keep making versions of this?
           | https://www.youtube.com/watch?v=vGAqYNFQdZ4 ("Don't be a
           | sucker" 1945)
        
             | wutbrodo wrote:
             | Yea, I hear you, that's certainly possible. But I'm not
             | reasoning top-down from liberalism's unpopularity, but
             | rather bottom-up from the inability most people have to
             | grasp the components I mentioned. Some of them are much
             | simpler concepts than liberalism, and are applicable in
             | many cases that people have more direct stakes in. And yet,
             | in my experience, vanishingly few people are capable of
             | grasping them to any reasonable degree.
             | 
             | > Perhaps we need to continuously keep making versions of
             | this? https://www.youtube.com/watch?v=vGAqYNFQdZ4 ("Don't
             | be a sucker" 1945)
             | 
             | We're drowning in anti-racist and anti-fascist material. I
             | don't think it's particularly relevant: clearly it's not
             | difficult to nominally support these beliefs while still
             | being extremely illiberal.
             | 
             | That's what's so insidious about illiberalism. It can (and
             | does) poison any ideology, even "good ones", because once a
             | sufficient number of people are bought into it, it hardens
             | into dogma and leads easily to "why should we protect those
             | that disagree with our holy belief system?".
             | 
             | To use an example that we have no trouble recognizing as
             | illiberal, from our modern perspective: Christianity on
             | paper is a very liberal tradition, full of exhortations to
             | love thy enemy and spread peace and love. I don't doubt
             | that this deeply resonated with many early converts. But
             | after a thousand years of spreading to the masses and
             | ossifying into institutions, the medieval Church resembled
             | every other illiberal institution: here's the dogma, and if
             | you don't like it, well then you'll love my torture
             | dungeon.
        
         | censorshipirony wrote:
         | There's great irony to posting this on HackerNews. It's one of
         | the most gung ho pro-censorship sites on the internet. The mods
         | will pull anything that doesn't align with their politics or
         | YC's profit motive. They'll claim it's under the guise of
         | "intelligent discussion", which is an even weaker justification
         | than that offered pretty much everywhere else. This site has a
         | lot of discussion about free speech but what's allowed here is
         | a very narrow and pro-YC subset of conversation.
        
         | mcguire wrote:
         | Yeah, most people tend to get squidgy about abstracts like
         | "free speech" when the crazies come out of the woodwork and
         | start calling for treason, mass deaths, and so on.
        
         | BoxOfRain wrote:
         | >I have a much better understanding of historical moral panics
         | and cessions of freedoms. Whereas I used to wonder how some
         | societies ever gave in to such pressures, I now realize it's not
         | that hard to persuade the average citizen into accepting such
         | things.
         | 
         | Me too. I've always wondered what drives people into the arms of
         | fascists and other such overtly unpleasant belief systems and
         | the simple answer is fear. Fear for yourself, your family, and
         | your future will apparently cause people to abandon all sorts
         | of supposedly cherished beliefs.
         | 
         | This is why I think here in the UK SAGE's SPI-B group using
         | behavioural psychology to increase the perception of personal
         | threat as part of the anti-coronavirus measures was such a
         | dangerous and short-sighted policy. Using fear might be a
         | convenient way to convince people into doing what you want for
         | a while, but fear also drives people into the welcoming arms of
         | all kinds of nasty ideologies. That cat's out of the bag now
         | too; I suspect using fear to "nudge" the public into doing what
         | the government of the day wants will become a much more common
         | feature of liberal democracies in the future. We've done a deal
         | with the devil, and he always collects in the end.
        
           | Clubber wrote:
           | >Fear for yourself, your family, and your future will
           | apparently cause people to abandon all sorts of supposedly
           | cherished beliefs.
           | 
           | And we (the US) must realize we've been put under a constant
           | state of fear by news / advertisers for the last 30+ years.
           | We don't really even realize that we aren't in a "neutral"
           | mental state because we've been constantly bombarded by fear
           | mongering. The constant fear is "normal" here. It wasn't
           | always like this. This is why "for the children" is so
           | effective at infringing on our rights: we are afraid for the
           | wellbeing of our kids, more so than for ourselves. It's
           | nefarious to
           | use our fear for our children to pass controversial
           | legislation. I mean stop to think about how evil that is.
           | 
           | What would our country look like if we weren't constantly
           | being programmed to be afraid?
        
             | BoxOfRain wrote:
             | >It's nefarious to use our fear for our children to pass
             | controversial legislation. I mean stop to think about how
             | evil that is.
             | 
             | It's absolutely contemptible, yet as Western societies we
             | spend so much time criticising our neighbours and blaming
             | them for our problems rather than criticising the people
             | amping up the fear on us. If I could change just one
             | thing about society, I'd introduce some sort of "immune
             | system" against those who try to use fear to manipulate
             | people; the correct response to fearmongering is contempt
             | towards those responsible for it, in my opinion.
             | 
             | >What would our country look like if we weren't being
             | constantly being programmed to be afraid?
             | 
             | I'm not American, but here in the UK we face exactly the
             | same issue. I think both of our countries would be
             | unrecognisable and probably a lot better than they are
             | today. How much avoidable inequality exists because the
             | fear of Russian-style Bolshevism harmed moderate left-wing
             | policies in the 20th century? How much authoritarianism
             | could we have avoided if the War on Terror hadn't
             | itself become an intense source of domestic fear in the
             | 21st? Fear is the fountain from which all tyranny and
             | bigotry springs in my opinion.
             | 
             | It's not just politics that would be affected, every aspect
             | of society would be changed for the better I think. Maybe
             | that's the form a modern Enlightenment would take, an
             | active rejection of fear and promotion of courage and
             | tolerance for dissent in its place.
        
         | unethical_ban wrote:
         | Disinformation is doing more harm to society than encrypted
         | terrorist emails ever could. The "disinformation vs. free speech"
         | debate will be a defining balance of the next decade, even
         | more fraught than "privacy vs. security" in cyberspace.
         | 
         | I'm not censoring someone because they want low taxes, high
         | taxes, disbanding the department of education, or giving every
         | poor person a new car.
         | 
         | There _is_ a line to be drawn somewhere: the point where speech
         | crosses from the expression of political thought and free
         | thinking into willful (or willfully amplified), factually
         | incorrect statements generated by sophisticated trolls and
         | adversarial nation-states.
         | 
         | My personal take: The Internet is a vital tool for free
         | expression, and as such, a "floor" of free expression should be
         | permitted. Ensure that DNS and ISP connectivity cannot be
         | removed from someone based on the legal expression of thought.
         | Those are infrastructure.
         | 
         | Youtube, Facebook, and other amplification platforms? In the
         | short term, I don't see how we force them to host actively
         | harmful content without recategorizing their role in society.
         | 
         | edit to respond to iosono88 (since HN is throttling my ability
         | to rebut): I'll keep my response simple: I also don't like
         | payment processors being used as control mechanisms for legal
         | activities.
        
           | wutbrodo wrote:
           | > In the short term, I don't see how we force them to host
           | actively harmful content without recategorizing their role in
           | society.
           | 
           | This recent flip from "Facebook et al aren't doing enough,
           | put the screws on them" to "we can't force Facebook not to
           | censor" is quite disingenuous. These companies and their
           | founders are famously liberal, and were dragged kicking and
           | screaming into ever more heavy-handed moderation, by both
           | public opinion and veiled threats of regulation from
           | politicians. There's plenty of statements on the record of eg
           | Zuckerberg saying what was obvious to most of us: nobody,
           | including Facebook, wants them in the position of deciding
           | what is true and what is false.
           | 
           | Leaving aside whether content moderation is a good thing,
           | let's not pretend that the situation here is that Facebook
           | really wanted to become the arbiter of truth and
           | misinformation and we can't stop them from being so.
        
             | y4mi wrote:
             | They already were at that point, regardless of what people
             | claim.
             | 
             | Facebook had the ability to paint whatever picture they
             | wanted as truth by controlling what their users saw at what
             | time. And they utilized it proudly long before COVID to
             | increase engagement.
        
               | wutbrodo wrote:
               | Sure, and long before Covid, that was a valid (and much-
               | expressed) criticism of them, as well as a general
               | criticism of using non-federated platforms.
               | 
               | But expanding explicitly into deciding what users are
               | allowed to see and express to each other is a million
               | times worse than the type of banal malevolence that
               | arises from "show people what they like to see".
        
               | unethical_ban wrote:
               | There's the fundamental difference! I love finding the
               | "fundamental difference". The crux, the place where
               | philosophies diverge, where the understandings break
               | down.
               | 
               | I fundamentally disagree with the premise that "keeping
               | harmful, intentional disinformation away from people" is
               | worse than "letting people unknowingly subscribe to
               | disinformation".
               | 
               | I would argue, perhaps, that there should be open
               | policies on what topics are off-limits. That Facebook,
               | et al. should have to document to the public what
               | "viewpoints" and disinformation they limit - and
               | furthermore, that more of these content-display
               | algorithms should be auditable, competitive secrecy be
               | damned.
               | 
               | I wouldn't call hundreds or thousands of people dying due
               | to disinformation-backed vaccine skepticism "banal",
               | either.
        
               | wutbrodo wrote:
               | > "keeping harmful, intentional disinformation away from
               | people"
               | 
               | This is begging the question though. It's assuming that
               | "harmful, intentional disinformation" is 1) well-defined
               | and 2) always going to be determined by those with your
               | best interests at heart. It relies on a blind faith in
               | the fundamental and eternal moral purity of Facebook and
               | other mega-corporations. I wholeheartedly disagree that
               | they fit this mold.
               | 
               | This is true even if you turn your religious passion
               | towards government institutions instead of Facebook. Do
               | you similarly agree that criminal trials and due process
               | are unnecessary? After all, the same pre-hoc confidence
               | in your ability to categorize without rigor leads to "Why
               | would we want criminals going free due to technicalities
               | and lawyer's tricks?". I assume you don't agree with this
               | statement, because in that context, you've internalized
               | the idea that institutions are both corruptible and
               | flawed even when uncorrupted, and it behooves us as a
               | society to have some epistemic humility instead of
               | pretending that Truth is carved onto clay tablets and
               | handed down from God.
               | 
               | If you've paid any attention to the pandemic, you'd know
               | that even a situation where government is in full control
               | of defining "misinformation" can be consistently and
               | significantly misleading. "Mask truther" used to mean
               | someone who thought wearing masks was a good idea for
               | preventing spread, discussing the lab leak hypothesis was
               | "misinformation", the vaccines were "rushed and unsafe"
               | until Trump was out of office, etc etc etc. It's hard to
               | pick a topic where it wasn't trivial to front-run
               | official advice by several months, repeatedly, over the
               | entire pandemic.
               | 
               | It's a bit of a paradox: The very certainty and
               | (illusory) Truth-with-a-capital-T that you take for
               | granted is forged through a process of skepticism,
               | second-guessing, and constant poking at beliefs.
               | Hamstringing that process is like killing the golden
               | goose to make more room for all the eggs you plan to
               | have.
        
               | unethical_ban wrote:
               | Your entire "if you've paid any attention" paragraph
               | is questionable.
               | 
               | I never heard people get mocked at any point for wearing
               | a mask during the pandemic, even in the beginning when
               | the CDC said it wasn't necessary.
               | 
               | In my PERSONAL opinion, the "lab leak" theory was never
               | misinformation, but uninformation: Discussion pushed
               | forward by right-wing outlets to generate an enemy, a
               | "they" to blame. They used it as a cudgel without
               | evidence against Anthony Fauci, and they beat the shit
               | out of Asians because of it. Most importantly, it was
               | completely irrelevant to the extent we were debating it,
               | when the focus was on containing a disease for which
               | there was no cure or vaccine.
               | 
               | And while there was some public skepticism about the pace
               | of the vaccine process, I likewise don't think there was
               | a "switch-flip" of trust in it like you suggest. When it
               | came time to take it, everyone who wasn't a vaccine
               | skeptic already went to get it when they could, and
               | clearly the Trump admin was in charge through the
               | development of the vaccine.
               | 
               | ---
               | 
               | There is also a difference between someone posting "I do
               | not trust the government not to be incompetent, or not to
               | run a mass trial on people" (though I think those people
               | are nuts re: the vaccine), and someone saying "I know
               | that Bill Gates and Anthony Fauci put microchips into a
               | bone-melting serum that will activate in a year!"
               | 
               | It's a huge, multi-faceted issue. In the end, the
               | problem to TRY to solve in coming years will be sifting
               | between legitimate skepticism and good-faith debate on
               | one hand, and nation-state/Fox-News lies that intend to
               | manipulate you into anger and division on the other ---
               | and deciding whether private entities have an obligation
               | to allow harmful information across their channels.
        
           | iosono88 wrote:
           | You've added an addendum to protect DNS and ISPs, both of
           | which have been used to censor citizens. Thoughts on pulling
           | payment processing via a very generalized Chokepoint?
           | 
           | https://en.m.wikipedia.org/wiki/Operation_Choke_Point
        
         | kgwxd wrote:
         | IMO, if the government isn't using the extraordinary powers
         | granted to it (search, seizure, arrests, fines, imprisonment,
         | etc.) against its citizens strictly for the words they say or
         | write, "free speech" hasn't been violated and it would be
         | incorrect to use the term in that context.
         | 
         | I think even a government hosted forum could filter content
         | without violating even the spirit of "free speech", and I'm
         | pretty sure it wouldn't legally be considered a violation of
         | the first amendment, even if that filtering reached blatant
         | censorship levels.
         | 
         | Censorship, government or private, is a separate concern from
         | freedom of speech, and using one to make arguments about the
         | other doesn't make any sense to me.
        
           | throwaway894345 wrote:
           | > IMO, if the government isn't using the extraordinary
           | powers granted to it (search, seizure, arrests, fines,
           | imprisonment, etc.) against its citizens strictly for the
           | words they say or write, "free speech" hasn't been violated
           | and it would be incorrect to use the term in that context.
           | 
           | "Free speech" is overloaded. There's the abstract concept of
           | "free speech" and then there's the first amendment which
           | specifically limits what the US government is allowed to
           | censor.
        
         | ggggtez wrote:
         | How can you support free speech but prevent a company from
         | exerting that speech?
         | 
         | YouTube banning antivax content _is speech_.
         | 
         | You want the government to start mandating that a company can't
         | take a stance on important issues of healthcare? Churches spend
         | all day every day taking stances on abortion. You want the
         | government to tell them they can't take a side?
        
           | ThrowawayR2 wrote:
           | > " _How can you support free speech but prevent a company
           | from exerting that speech?_ "
           | 
           | Corporations are not humans (regardless of the "corporate
           | personhood" doctrine) and thus should not be entitled to the
           | full rights of humans. Semi-monopolies like Youtube are
           | _especially_ not entitled to use their dominance to
           | manipulate public opinion, given how easily it can be abused.
           | 
           | Ask yourself, if YouTube were pushing conspiracy content and
           | suppressing pro-vaccination content instead, would the parent
           | poster and those like them still be saying what they are
           | saying? Fair-weather friends indeed.
           | 
           | > " _You want to the government to tell them they can 't take
           | a side?_"
           | 
           | The United States government already can and has in the past;
           | see https://en.wikipedia.org/wiki/Equal-time_rule and
           | https://en.wikipedia.org/wiki/FCC_fairness_doctrine .
        
             | joshuamorton wrote:
             | > Corporations are not humans (regardless of the "corporate
             | personhood" doctrine) and thus should not be entitled to
             | the full rights of humans.
             | 
             | True, but corporations are just collections of people with
             | shared goals. Should groups of people lose "fundamental"
             | rights when they organize?
             | 
             | > Fair-weather friends indeed.
             | 
             | Yes, I fully support the rights of platforms to do stupid
             | things. Use rumble or whatever if you want. I'll mock those
             | platforms, but I don't think the government should ban
             | them.
             | 
             | > Semi-monopolies like Youtube are especially not entitled
             | to use their dominance to manipulate public opinion, given
             | how easily it can be abused.
             | 
             | So, you think that we should circumstantially limit
             | constitutionally protected rights, for the greater good?
             | Fair weather friends indeed.
        
               | stale2002 wrote:
               | > So, you think that we should circumstantially limit
               | constitutionally protected rights
               | 
               | According to your logic, you should think that common
               | carrier laws should be repealed entirely.
               | 
               | Think of the telephone company blocking certain
               | political groups, or the only water company in town
               | refusing to deliver water to certain people who say
               | things that it doesn't like.
               | 
               | It is pretty similar, philosophically. Common carrier
               | laws are pretty uncontroversial.
               | 
               | So it feels weird for you to be making these types of
               | arguments when it is already established that there are
               | major counterexamples.
               | 
               | So you'd have to either recognize the contradiction, or
               | admit that your position is at odds with other
               | established and uncontroversial laws.
        
               | joshuamorton wrote:
               | I didn't take any particular position; I pointed out
               | the incongruence in the one they espoused.
               | 
               | I'm admittedly mixed on common carrier laws: I think
               | that they are "impure" in a sense, but I also think the
               | benefits are greater than the costs, even taking into
               | account the potential theoretical erosion of our rights.
               | 
               | I absolutely agree that there are major counterexamples,
               | and I'm overall _fine with that_ , but I'll also freely
               | admit that I'm not a free speech absolutist of any form.
        
               | ThrowawayR2 wrote:
               | > " _So, you think that we should circumstantially limit
               | constitutionally protected rights, for the greater good?_
               | "
               | 
               | Why, yes, I do. Ask yourself: what were the various
               | civil rights victories and legislation for minorities,
               | women, and LGBTQ people, other than saying " _We are
               | circumscribing your
               | rights, including ones formerly interpreted as
               | constitutionally protected, so that these protected
               | classes may be treated equally, for the greater good_ "?
               | Sounds like you'd argue against that.
        
               | joshuamorton wrote:
               | I mean, yes. I think that's fine. But that goes in both
               | directions: if you're willing to sacrifice the speech of
               | some for the speech of others, clearly either you aren't
               | a free speech maximalist, or there is some inherent
               | contradiction in what free speech is. Because if my free
               | speech requires limiting yours, well...how do we decide
               | whose speech is more important?
        
           | tomjen3 wrote:
           | Youtube is a monopoly, and monopolies should be limited in
           | the same way the government is, and for the same reason. This
           | also applies to groups of otherwise independent businesses
           | that operate in concert.
           | 
           | That, and adding political beliefs to the list of protected
           | classes, is what is necessary to start the US healing
           | process. Until there is no other option but to talk with
           | the people you despise, neither side will start doing it.
        
           | londgine wrote:
           | I like the free market. If a store that also likes the free
           | market decided to raise its prices significantly because it
           | claims to be better than everyone else, I would still think
           | that is its right in the free market. However, since I value
           | the free market so much, I wouldn't buy from it. Similarly,
           | if YouTube wants to exercise its right to freedom of
           | expression by censoring content, then I, as someone who
           | values freedom of expression, will use it less.
           | Unfortunately, while in the first example many people would
           | behave like me and cause the store to lower its prices, not
           | enough people value freedom of expression for YouTube to
           | care about losing those users.
        
             | BitwiseFool wrote:
             | I think discussions of the free market need to include
             | scale. Scale absolutely matters when it comes to "voting
             | with your wallet", or I suppose, in this case, your usage
             | of a platform.
             | 
             | I think our notions on the merits of a free market, and
             | indeed, the very understanding of a free market itself,
             | come from a time _before_ the network effect and the de-
             | facto digital monopolies we see today.
        
           | Covzire wrote:
           | No church has a monopoly on the public square that is being
           | exploited by one political party or corporate interest.
        
           | zionic wrote:
           | >How can you support free speech but prevent a company from
           | exerting that speech?
           | 
           | Because these "platforms" are in fact utilities.
           | 
           | We have allowed corporations to own and control the common
           | square and bypass rights our forefathers fought wars to
           | establish.
           | 
           | The gov has been all too lenient enforcing laws against these
           | giants because it allows them to censor-by-proxy.
           | 
           | For the left-leaning among us, please recall that the
           | definition of fascism is "the merger of state and corporate
           | power".
        
             | Supermancho wrote:
             | As Thiel says, the free market (in whatever form it
             | ultimately takes) is a selector for monopolies. At peak
             | capitalism, you still have startups competing with an
             | increasingly low chance of success, barring
             | scandals...which are inevitable in large organizations.
             | 
             | The biggest companies are basically utilities and that will
             | not change anytime soon. The market has resulted in this
             | condition. The government has to play catchup, as usual.
        
           | drewcoo wrote:
           | > You want the government to start mandating that a company
           | can't take a stance . . .
           | 
           | I, for one, want less monopolistic media so that the people
           | can exert viewership pressure; they can get their media
           | elsewhere and the ad money will follow. The content being
           | stopped is not the only loss of people's voices happening
           | here.
        
           | leetcrew wrote:
           | > How can you support free speech but prevent a company from
           | exerting that speech?
           | 
           | I can be deeply disappointed by youtube's moderation
           | decisions without suggesting that the company be _compelled_
           | to allow certain content. as an aside, I find it frustrating
           | to see people constantly swapping between "free speech" as a
           | legal concept and "free speech" as an abstract ideal in these
           | threads. we talk past each other the same way every time the
           | debate comes up. just because the law is written the way it
           | is doesn't mean that's necessarily the way it should be. and
           | even if we can't write the law "just right", we can still
           | advocate for higher principles to be followed.
           | 
           | anyways, I generally agree with the "companies can manage
           | their properties as they see fit" line of thought. but it
           | becomes problematic when our public spaces are increasingly
           | controlled by a small number of huge companies that mostly
           | share the same politics. I'm not really sure what the
           | solution is, but it sucks to watch it unfold.
        
           | AnimalMuppet wrote:
           | The problem is that Youtube is big enough, and carries enough
           | of the global conversation, that we think it should be a
           | common carrier. (Think of the phone company back in the day.
           | They didn't care if you were literally the Nazi Party of
           | America, they carried your phone calls just like everybody
           | else's.) People kind of think of Youtube that way, even
           | though, legally, Youtube isn't playing by those rules.
           | 
           | But there's also this two-faced evaluation of Youtube. When
           | Youtube blocks the other side, people say "private company,
           | First Amendment, they can carry what they want". But when
           | Youtube blocks _their_ side, people at least feel the
           | violation of the "common carrier" expectation, and get
           | upset.
           | 
           | So maybe it's time for us as a society to decide: Has Youtube
           | (and Facebook, and Twitter, and Google) gotten big enough and
           | important enough that they should be regulated into some kind
           | of "common carrier" status? Or do we want them to continue as
           | they are?
        
           | stale2002 wrote:
           | > You want the government to start mandating that a company
           | 
           | There is already an established history of requiring certain
           | large communication platforms to act a certain way.
           | 
           | They are called common carrier laws, and already apply to
           | things like the telephone network.
           | 
           | Sure, they don't currently apply to other things, but the law
           | could be updated, so that they do.
           | 
           | Philosophically, common carrier laws are uncontroversial, and
           | already apply to major communication platforms, so you don't
           | get to pretend like this is unprecedented.
        
         | jan_Inkepa wrote:
         | I had the same realization about secularism when a Catholic
         | became head of the big secularist-leaning party in my then
         | country of residence. People were sure interested in his
         | private feelings about sinfulness, or said that they would
         | never vote for a party with a religious leader. The party
         | angled itself as secularist, but that particular kerfuffle
         | revealed secularism to be a highly contingent value among the
         | membership (and among people generally).
         | 
         | It's good to know what people really care about, and what
         | beliefs are negotiable.
        
           | iammisc wrote:
           | I'm confused... isn't a secularist party in favor of not
           | involving personal religious beliefs in civic issues? In
           | that light, what's wrong with someone with personal
           | religious beliefs participating? Others made his religion
           | important, not him, from what you describe.
        
             | Broken_Hippo wrote:
             | I'm sorry, but: Wouldn't it be important to make sure the
             | person _actually_ wants secular values in politics when
             | parts of their religion don't think the same way?
             | 
             | I mean, if he personally thinks folks shouldn't be using
             | birth control, or that same-sex marriage is bad (as the
             | Catholic Church does), isn't it important to ask how they
             | deal with this? Does it come out in votes, or does the
             | person think that they should live up to a stricter moral
             | code than what the law dictates?
             | 
             | If you don't find this stuff out, you might wind up in a
             | position where the law reflects religious values rather
             | than broader secular ones. You don't have to have an
             | official religion for this to happen, merely enough
             | religious folks in office that vote with their religion.
        
               | jan_Inkepa wrote:
               | Hey, I think your comment is interesting and raises a
               | number of valid points, and I basically agree with you. I
               | spent about 40 minutes drafting and redrafting more
               | detailed replies, but everything I could think of saying
               | read like kicking off a very rote/standard internet
               | discussion about politics and religion, and I wouldn't
               | wish that upon either of us.
        
             | naasking wrote:
             | Exactly, it's like the unfortunately common belief that
             | religious people can't be scientists. Let their actions
             | speak for themselves; if they are unreasonably biased, it
             | will show.
        
             | Robotbeat wrote:
             | That's the whole point I think
        
             | jan_Inkepa wrote:
             | Right, that was the surprise. It should be what you said,
             | but in practice it became the tribe-of-choice for people
             | who wanted not just religion out of government but also
             | religious people.
        
         | LudvigVanHassen wrote:
         | And it's a damning realization at that. Nevertheless, it seems
         | to be true.
         | 
         | How do we maintain the emphasis on free speech with our highly
         | fragmented population, who have WIDELY varying beliefs and
         | little shared culture?
        
           | rsj_hn wrote:
           | > How do we maintain the emphasis on free speech with our
           | highly fragmented population, who have WIDELY varying
           | beliefs and little shared culture?
           | 
           | We cannot. Diversity, Freedom, Centralization. Pick two.
           | 
           | What has kinda worked in some countries is splitting up into
           | different areas and letting them run their own affairs --
           | e.g. Swiss-style cantons.
           | 
           | But that type of arrangement is very fragile given the
           | organizational advantages of centralization, and countries
           | like Switzerland are notable precisely because they are
           | exceptions.
        
         | mardoz wrote:
         | What free speech advocates ignore here is that even they
         | generally draw a line somewhere. Death threats, defamation,
         | pedophilia, sharing bomb-making materials, etc. are generally
         | agreed to be unacceptable.
         | 
         | 'But those are different' is usually the argument here. But
         | why, though? Because they cause harm? Doesn't inciting racial
         | hatred cause harm?
         | 
         | Once we stop arguing over whether the issue is black/white and
         | instead discuss _where exactly_ we draw the line, then I think
         | we are finally having a far more honest discussion. Whether
         | some speech is illegal and other speech isn't shouldn't be the
         | deciding factor in whether someone (or some company) needs to
         | platform that view. That leaves it to governments to decide
         | what is OK and what isn't; instead, leave it to society to
         | choose not to propagate hateful and distasteful messages.
         | 
         | I also find it frustrating that some armchair psychologists
         | have decided that people like me with more nuanced views simply
         | want to repress and censor things that we disagree with which
         | is not what it's about at all.
        
           | SuoDuanDao wrote:
           | >I also find it frustrating that some armchair psychologists
           | have decided that people like me with more nuanced views
           | simply want to repress and censor things that we disagree
           | with which is not what it's about at all.
           | 
           | Well, you've got to draw the line somewhere, right? /s
           | 
           | But seriously, I think one can be a free speech advocate
           | without being a free speech absolutist, _and_ believe that we
           | are heading in the wrong direction. In fact I think it's
           | wrong to think of 'where to draw the line' at all, because
           | what is acceptable discourse is not, and should not be,
           | thought of as static.
           | 
           | I think a much better question is how 'the line' is shifting,
           | as the number of things we can't talk about is both a lagging
           | indicator of institutional health (because being able to have
           | uncomfortable conversations is a sign of emotional maturity),
           | and a leading one (because public discourse is necessary to
           | solving problems we don't understand or would rather not
           | acknowledge).
        
           | NaturalPhallacy wrote:
           | > _What free speech advocates ignore here is that even they
           | generally draw a line somewhere._
           | 
           | The issue isn't where the line is drawn. The issue is that no
           | singular entity can be trusted to draw that line.
        
           | idiotsecant wrote:
           | >What free speech advocates ignore here is that even they
           | generally draw a line somewhere.
           | 
           | Yes, that is a problem that has seen some discussion in
           | government. To solve this problem researchers have created
           | so-called 'crimes', which are violations of 'laws' that
           | define the things we may and may not do, and people become
           | subject to 'punishments' if they disobey.
           | 
           | Leading thinkers have suggested that the way to make things
           | 'illegal' is to pass 'laws' restricting them.
        
           | dude187 wrote:
           | Nobody is bound to that stance. Some people, myself included,
           | believe that there is no line. You have the right to say
           | anything.
        
             | short_sells_poo wrote:
             | I agree with this line of reasoning, but then we have to
             | ask the question: are platforms obligated to host all
             | speech that is not illegal (in whatever jurisdiction they
             | reside in)? Should they be obligated? If I as a private
             | citizen decide to create a platform to host discourse, do I
             | have the freedom to decide what is permitted there?
             | 
             | If the answers to the followup questions are "yes", then we
             | arrive at the status quo, where most of the big platforms
             | heavily censor the content. This means that if you want to
             | say something that'd be censored, you have the right to do
             | so, but you don't have the means. I suppose you can walk
             | out into the street and say it to the people there, but
             | there's a certain lack of reach in that approach :)
             | 
             | So there's also a practical aspect, where you may have full
             | freedom of speech by law, and yet in reality you have no
             | freedom because nobody will give you the possibility to
             | actually communicate what you want to say. You may try to
             | build your own platform, but then you run into second order
             | problems where you'll find that no service provider will
             | want to host your servers.
             | 
             | At some point, you may hit a barrier where you have no
             | monetary means to build all the infrastructure necessary to
             | be able to provide a truly free speech shelter.
        
               | poszlem wrote:
               | The reason why platforms are expected to host even the
               | speech they don't agree with is the same reason why some
               | bakers are forced to make cakes for homosexual weddings.
               | If you compel the latter, you should also compel the
               | former, and vice versa.
        
               | dude187 wrote:
               | I believe that common carrier status should apply to
               | platforms, not just the infrastructure of wires. These
               | platforms could not exist without the large privilege
               | given to them: immunity for the illegal content they
               | serve up.
               | 
               | The problem is that these platforms have it both ways.
               | They can censor entire political parties, yet play dumb
               | and cry immunity ("we're just a platform") when there's
               | actual illegal content that makes its way onto their
               | public hosting.
               | 
               | Once they censor based on content, they should lose all
               | immunity and be considered a publisher of that content.
               | They're no longer a passthrough; they're now actively
               | working to manipulate opinions.
        
           | readams wrote:
           | US law defines the line as speech that incites imminent
           | lawless action. The speech must not only encourage such
           | action; the action must also be imminent and likely. This is
           | in practice a pretty good line.
           | 
           | If you think that private companies should censor much more
           | heavily, then you obviously don't really believe in free
           | speech. There's no obvious reason why it's more acceptable
           | for all the dominant communication platforms to censor
           | speech than for the government to do it.
        
             | micromacrofoot wrote:
             | If someone replies to my comment and says "I'm going to
             | kill you" how do I determine how imminent or likely it is?
             | Should that be protected under free speech? I find this
             | incredibly hard to navigate.
        
             | dzader wrote:
             | you're forced to follow laws, you aren't forced to post
             | memes on twitter.
        
           | throwaway894345 wrote:
           | > What free speech advocates ignore here is that even they
           | generally draw a line somewhere. Death threats, defamation,
           | pedophilia, sharing bomb-making materials, etc. are
           | generally agreed to be unacceptable.
           | 
           | Assuming we're talking about the abstract concept of free
           | speech (as opposed to 1A, which dictates what the US
           | government is allowed to censor), the "line" is around
           | expression of ideas. A death threat isn't "expressing an
           | idea", it's coercing someone with violence. Similarly
           | "harassment" which you didn't mention, falls out of bounds of
           | speech because it violates another's right of association
           | (you have the right to speak, but you can't force me to
           | listen). Pedophilia obviously isn't an expression of an idea,
           | although pedophilic advocacy, while repugnant, is still in
           | bounds of free speech by definition.
           | 
           | Whether platforms are obliged to adhere to a "free speech"
           | standard is a different question. Personally, I think so much
           | of our speech is flowing through a handful of these large
           | platforms that they are the _de facto_ public square, and
           | should be regulated accordingly or broken up. Even if they
           | could articulate a clear moderation policy and enforce it
           | fairly, simply having that much power to determine who sees
           | what content for so many citizens is concerning. Even if
           | you're a liberal or progressive and thus largely enjoy the
           | alleged bias in platforms' moderation policies/enforcement,
           | recall that in 2016-2017 we were _virtually certain_ that
           | Russia was manipulating Twitter to influence the US
           | presidential election--if you believe Russia can _indirectly_
           | influence our elections via Twitter, then it necessarily
           | follows that Twitter can _directly_ influence our elections
           | and surely that's too much power to give to a _corporation_.
           | 
           | > 'But those are different' is usually the argument here. But
           | why though? Because they cause harm? Doesn't inciting racial
           | hatred cause harm?
           | 
           | I think the issue is that many don't trust platforms to
           | enforce their own policies consistently. On Twitter for
           | example, one gets the impression that it's okay to incite
           | racial hatred toward whites, Asians, Jews, and even
           | "ideologically diverse" blacks/etc, which is to say that the
           | policy is neutral but the enforcement is biased--and that
           | biased enforcement constitutes harm. Of course, a racist
           | might respond "Good, we _should_ punish whites, Asians, and
           | Jews for their race because historically other whites,
           | Asians, and Jews _have enjoyed_ various advantages because of
           | their race ", but presumably the goal is to minimize racism.
        
             | lowbloodsugar wrote:
             | Compare:
             | 
             | "My followers, I order you to go out and kill this man"
             | 
             | vs
             | 
             | "I wonder if the world would be a better place if this man
             | did not exist"
             | 
             | However you define free speech, someone can order a
             | killing while remaining acceptable under your rules.
        
               | throwaway894345 wrote:
               | Defining free speech is easy, adjudicating it is
               | sometimes hard. In this case, a threat requires the
               | intent to compel (note that compulsion != persuasion).
               | Determining whether "I wonder if the world would be a
               | better place if this man did not exist" is intended to
               | compel or not is harder.
               | 
               | But most free speech absolutists will be pretty content
               | if we get to a point where the thrust of the free speech
               | debate concerns itself with outlier cases like this one
               | (rather than "is it 'hate speech' to criticize woke
               | excesses?" or "to use a Chinese word that sounds vaguely
               | like an English racial slur?").
        
               | jancsika wrote:
               | > But most free speech absolutists will be pretty content
               | if we get to a point where the thrust of the free speech
               | debate concerns itself with outlier cases like this one
               | (rather than "is it 'hate speech' to criticize woke
               | excesses?" or "to use a Chinese word that sounds vaguely
               | like an English racial slur?").
               | 
               | In your absolutist world how do you stop the trolls on
               | the current incarnation of social media from flooding the
               | medium with references to these outlier cases until it
               | triggers censorship?
               | 
               | When those same trolls continue pentesting the medium
               | until they trigger censorship on less direct references,
               | you're going to be left with examples functionally
               | equivalent to the ones you are comparing to above.
        
         | bitwize wrote:
         | New Zealand has a Censorship Office and an official called the
         | Chief Censor. The government has the power to declare that
         | certain communications are illegal and prosecute people for
         | making those communications. They recently, famously did this
         | with footage from the Christchurch shootings as well as the
         | shooter's manifesto.
         | 
         | Yet New Zealand consistently tops Cato Institute's freedom
         | index.
         | 
         | Maybe, just maybe, an absolutist reading of free speech that
         | favors the spreading of hate and misinformation, as we have
         | had in the USA ever since _Brandenburg v. Ohio_ in 1969[0], is
         | not a necessary condition for freedom. Maybe restricting
         | speech actually has a public benefit and can increase the
         | freedom and well-being of individuals with no hateful or
         | dishonest intent.
         | 
         | [0] Note that Brandenburg himself was a KKK member, so yes,
         | the current bedrock court ruling for free speech in the USA
         | was crafted specifically to protect hate speech.
        
           | mardifoufs wrote:
           | >Yet New Zealand consistently tops Cato Institute's freedom
           | index.
           | 
           | Well I'm sure those who got arrested for sharing the
           | Christchurch footage (footage that was, btw, widely available
           | and shared worldwide) will be happy to hear that. They may be
           | in jail, but at least the Cato Institute declared that they
           | were still living in a country that tops their index. An
           | index that hasn't even changed or taken into consideration
           | the extreme drift towards authoritarianism that we have
           | seen in 2020.
        
           | Jiro wrote:
           | If you can ban something for being hate speech, calling
           | something hate speech becomes a weapon. I absolutely don't
           | trust any of the people banning "hate speech" to not have
           | double standards that make it easy to call anything
           | politically disagreeing with them "hate speech".
        
           | AnimalMuppet wrote:
           | From a personal perspective, I don't want to have to wade
           | through everyone else's "free expression", for the same
           | reason that I don't want to have to wade through every
           | company's advertisements. Experientially, I am less free when
           | everyone can interrupt my brain space with what they want to
           | shove at me. I want limited amounts of high-quality,
           | interesting communication, not the noise of everybody's
           | everything. (Think in terms of Shannon information theory.)
           | 
           | So I kind of see your point. Filtering out the garbage is no
           | loss to me. It makes me freer rather than less free. And
           | yet...
           | 
           | The failure mode of the New Zealand approach is to have
           | someone who is partisan hold the office of Chief Censor.
           | Worse, someone who is a dedicated partisan might see the
           | value of holding that position, and might deliberately,
           | systematically seek it, hiding their partisanship until they
           | obtained it. Sooner or later, someone's going to at least
           | try.
        
           | BitwiseFool wrote:
           | >"Yet New Zealand consistently tops Cato Institute's freedom
           | index."
           | 
           | Good for them, but that index is meaningless to me.
           | Truthfully, I have no respect for arguments that twist acts
           | of suppressing freedom into acts seen as enhancing or
           | preserving freedom.
        
         | [deleted]
        
         | nonameiguess wrote:
         | What changed in 2020? Do you not remember the Dixie Chicks
         | getting blacklisted for speaking against George Bush, Janet
         | Jackson being deplatformed from record labels for showing a
         | nipple on television, the LA DNC in 2000 cordoning off
         | protesters into "free speech zones" 4 blocks away from the
         | convention, gangsta rap and heavy metal being censored in the
         | 80s and 90s, pushes for civil rights being met with dogs and
         | firehoses in the 60s, Lenny Bruce being imprisoned for stand up
         | comedy, interracial marriage being illegal, teachers being
         | fired for being gay, American citizens having all of their
         | property confiscated because they were ethnically Japanese? We
         | had anti-sedition laws passed within 22 years of the
         | Constitution being ratified.
         | 
         | Most of these things that aren't a matter of private freedom of
         | association for platform owners tend to eventually get struck
         | down because the Supreme Court can somewhat reliably be counted
         | on to eventually respect the Constitution, but the general
         | public and elected politicians have never supported free speech
         | or free anything. I would say almost the exact opposite of what
         | you said. People only claim to love free speech when something
         | they want to say is unpopular or suppressed. Almost nobody just
         | supports it generally. The ACLU used to be pretty reliable
         | about this, i.e. defending the KKK and Nazis, but I'm not even
         | really sure where they stand any more.
         | 
         | The average citizen doesn't give a crap about any abstract
         | ideals at all. They just want to live their lives and possibly
         | raise a family in an atmosphere not dissonant with their own
         | cultural traditions and beliefs. Allowing people with other
         | traditions and beliefs to spread those via public advocacy,
         | art, or any other means that may lead to them being mainstream
         | or even dominant is antithetical to that.
        
           | brandonmenc wrote:
           | > What changed in 2020?
           | 
           | Liberals started doing it.
        
             | adamrezich wrote:
             | you're going to continue to be downvoted but it's the
             | truth. when Jon Stewart was going against the grain of the
             | widely-accepted political narrative of the Bush
             | administration, it was easy to agree with him calling out
             | bullshit. when the political pendulum swung the other way,
             | however...
        
             | ThrowawayR2 wrote:
             | > " _Liberals started doing it._ "
             | 
             | Let's be precise; the _left_ started doing it. We are
             | literally discussing how big parts of the political left
             | in the US are moving away from the liberal ideals and
             | philosophy that they once embraced and championed.
        
               | jessaustin wrote:
               | There should be a term for this, when conservatives in
               | USA have so little exposure to actual leftists that they
               | confuse MSNBC, the Democratic Party, and censorious
               | anklebiters with "the Left". Here's a hint: actual
               | leftists aren't cheering on the persecution of Assange,
               | like everyone on MSNBC is.
        
               | [deleted]
        
               | ThrowawayR2 wrote:
               | No true Leftist, hmm? Okay, I'll humor you, what should
               | we call "MSNBC, the Democratic Party, and censorious
               | anklebiters*" (sic) and the rest of the broad coalition
               | to the left of the American center, if not the left?
               | 
               | Aside from that, if you're suggesting I'm a conservative
               | (no worries, it's a common mistake; it's tough being
               | politically homeless), I'm a liberal unhappy with the
               | broad coalition that used to be called the left. If that
               | makes me a "conservative" in your eyes, well, there
               | probably should be a term for that too.
        
               | gremIin wrote:
               | The term exists and it is called the Overton Window. It
               | has been an intentional tactic and it is happening across
               | several political spectra. For example, being against
               | illegal immigration now gets you labelled anti-
               | immigration. Wanting the police to be held accountable
               | for their actions is now considered leftist instead of
               | normal. Being anti-vax is considered a personal choice
               | worthy of consideration instead of a fringe/lunatic view.
               | 
               | I would be very happy if I were Russia/China.
        
             | CheezeIt wrote:
             | Tipper Gore and FDR weren't liberal?
        
         | api wrote:
         | I have to be honest and say that my faith in the capacity of
         | people to think in a totally open free-speech environment was
         | severely tested over the past 5-10 years. Things like Qanon,
         | flat Earth, antivax hysteria, the meme-driven return of both
         | "right" and "left" totalitarian ideologies from the early 20th
         | century that should be completely discredited, and so on have
         | made me wonder if most people simply can't handle exposure to
         | unregulated content as they lack the ability to think
         | critically. I've actually wondered if most people might not
         | need to be protected from unregulated content in the same way
         | that people need to be protected from exposure to lead, radon
         | gas, etc.
         | 
         | The human brain simply didn't evolve in this environment.
         | Throughout 99.9% of human history a person's ideas came from
         | the tribe or neighborhood and came with the context of culture,
         | social relationships, and physical body language cues. The
         | brain did not evolve to process a context-free meme stream
         | connected to an algorithm trying to keep you "engaged."
         | 
         | If only a few people had fallen for this nonsense that would be
         | one thing, but I witnessed _mass conversions_ of millions of
         | people to ideas that are more absurd than the craziest
         | conspiracy bullshit I read on Usenet in the 1990s. This is
         | stuff the people who are crazy think is crazy.
         | 
         | It goes beyond shockingly bad ideas too. I've seen an alarming
         | rise of discourse that resembles word salad spewed out by a
         | primitive Markov chain text generator. It's terrifying. It
         | almost looks like exposure to open discourse is causing brain
         | damage in some susceptible fraction of the population. Some
         | subset of people seem to have lost the ability to process or
         | generate a coherent narrative or structured system of ideas.
         | They just spew memes in random order. It's less coherent than
         | classic new age babble or Orwellian "newspeak."
         | 
         | I still lean in the free speech direction pretty hard, but my
         | faith is shaken to the core. I honestly and non-hyperbolically
         | wonder if the right carefully crafted AI-powered social media
         | boosted propaganda campaign couldn't convert double digit
         | percentages of the population into homicidal maniacs or cause a
         | mass suicide wave killing millions.
         | 
         | BTW this and not "Skynet" is the scary thing about AI. The most
         | terrifying AI scenario I can think of is AI-assisted
         | demagoguery or mass stochastic terrorism powered by big data.
         | Think "adversarial attacks against the human neocortex."
         | 
         | Attacks only get better.
        
           | gedy wrote:
           | Flat Earth folks these days mostly seem to just delight in
           | being contrarian/trolls. I'm sure there's some unstable folks
           | in there like any belief system/group though.
        
           | mLuby wrote:
           | > I've actually wondered if most people might not need to be
           | protected from unregulated content in the same way that
           | people need to be protected from exposure to lead, radon gas,
           | etc.
           | 
           | Yes, but that means deciding _which_ content is harmful, and
           | that 's where we are now. Figuratively, you end up with lead-
           | lickers coming out of the woodwork saying that their way of
           | life is being stifled by regulation/moderation.
           | 
           | Chasing user "engagement" has been pushing conversations from
           | the mundane middle toward the fringes. Thus, people make the
           | understandable but hasty generalization[0] that what they're
           | seeing is more common than it really is.
           | 
           | [0]: https://en.wikipedia.org/wiki/Faulty_generalization#Indu
           | ctiv...
           | 
           | In the past, I think this drift was counteracted by codes of
           | morality (whether internalized, reinforced by people you
           | know, or promulgated by regulatory bodies) as well as the
           | limited means of disseminating information (few newspaper
           | editors/radio announcers/news anchors to many
           | readers/listeners/viewers). Though I'm sure there were plenty
           | of wild pamphlets spreading chaos in the centuries between
           | Gutenberg and Zuckerberg.
           | 
           | Even though most of those morality codes are downright
           | oppressive by today's standards, and the many-to-many
           | distribution enabled by the Internet has many benefits, we
           | haven't found a substitute, so there's a gap in our armor.
           | 
           | Side note: Believing conspiracies and yearning for
           | totalitarianism are two different failures in thinking. I
           | say that because only the latter had strong support in the
           | 20th century--even earlier if you count monarchies. Someone
           | supporting Flat Earthers isn't harming me (except by
           | undermining science in general); someone supporting Stalin
           | 2.0 is a direct threat to me.
        
             | thereisnospork wrote:
             | >Yes, but that means deciding which content is harmful, and
             | that's where we are now. Figuratively, you end up with
             | lead-lickers coming out of the woodwork saying that their
             | way of life is being stifled by regulation/moderation.
             | 
             | There might be something to be said for, instead of
             | limiting speech, increasing the 'reporting requirements'.
             | This isn't a fully formed position of mine, but rules
             | along the lines of no anonymous speech[0] and stricter
             | fraud[1][2] rules are imo compatible with free speech and
             | its ideals while helping to manage fraud and propaganda.
             | 
             | [0] So if an AI wrote a blog spam post, it should be at
             | minimum illegal to not have the AI in the byline, with e.g.
             | a unique identifier.
             | 
             | [1] Say loudly and publicly that there is a pedophile ring
             | in the basement of a pizzeria, with no evidence, go to
             | jail.
             | 
             | [2] Not that such rules can't/haven't been abused before
             | though.
        
           | nescioquid wrote:
           | Chris Hedges sometimes remarks that he finds people turn
           | towards superstitious, religious, or conspiratorial world
           | views when they find they have no control over their lives. I
           | suspect that if the US somehow changed so that economic
           | security were increased for most people, there would be much
           | less unreason and general discourse wouldn't be so mean.
        
             | PerkinWarwick wrote:
             | That's an interesting theory, although I doubt that 19th
             | century US farmers, among the most self-sufficient people
             | in history, were lacking in religion, superstition, or
             | conspiracies.
        
               | nescioquid wrote:
               | In fairness, it is trivial to point to any number of
               | counter-examples throughout history if you just want to
               | dismiss the observation. Hedges is a journalist and was
               | referring to the changes he saw in the squeezed middle
               | and lower classes through his career, and probably wasn't
               | making a sweeping historic claim.
               | 
               | I mentioned it as a contrast to the parent comment,
               | which simply concluded people are incapable of rational
               | thought, and meant to suggest there are mediating
               | influences.
        
               | PerkinWarwick wrote:
               | > if you just want to dismiss the observation.
               | 
               | I'm not and I'm willing to give it a chance. I'm just not
               | seeing a strong correlation.
               | 
               | What I do see people doing when they lack control is to
               | attempt to gain control. Put together a group, grab some
               | of that sweet power through mass. The more ambitious ones
               | fight their way up the power structure. Special bonus
               | points if you can take over an existing seat of power.
        
               | nescioquid wrote:
               | > What I do see people doing when they lack control is to
               | attempt to gain control.
               | 
               | One way in which people gain (the feeling of) control is
               | to imagine the world differently (superstition,
               | conspiracy, religion) in a way that makes them virtuous
               | or special and others not (i.e. essentially Nietzsche's
               | idea of ressentiment). If you haven't the power to take
               | part in a real struggle, this is not so surprising to me.
        
               | PerkinWarwick wrote:
               | That's a good point.
               | 
               | Perhaps the temptation is to choose philosophies that
               | have the appearance of power. Sticking pins in a voodoo
               | doll of your boss, Mr. Scrooge for example.
               | 
               | Some acquaintances of mine did that at their job once and
               | it worked a charm. The boss was permanently off on sick
               | leave within a month. Maybe there's something to it.
        
               | PerkinWarwick wrote:
               | I usually go a hundred or so and then change up.
               | 
               | Honestly Mr. Dang, I'll probably just bail. HN is 50%
               | non-technical, it's easy to get drawn into that, nothing
               | of real value is said by anyone (including me).
               | 
               | Add that in with the downvote system, which is incredibly
               | irritating, and you simply have a Reddit knock-off.
               | 
               | It isn't like 'hackers' and 'startups' do much of value
               | anymore given the move in concentration from workstation
               | software and complex devices to moar and moar internet
               | software components and surveillance marketing companies.
               | 
               | Seriously, have a wonderful rest of the week, but make an
               | effort to do things of real value.
        
               | dang wrote:
               | Could you please stop creating accounts for every few
               | comments you post? We ban accounts that do that. This is
               | in the site guidelines:
               | https://news.ycombinator.com/newsguidelines.html.
               | 
               | You needn't use your real name, of course, but for HN to
               | be a community, users need some identity for other users
               | to relate to. Otherwise we may as well have no usernames
               | and no community, and that would be a different kind of
               | forum. https://hn.algolia.com/?sort=byDate&dateRange=all&
               | type=comme...
        
               | mullingitover wrote:
               | Farmers through all of history and to this day are
               | absolutely never self-sufficient. They are at the mercy
               | of 'the gods', aka the weather, at all times. One bad
               | flood, one freeze, one week of ill-timed rain, and they
               | are destitute. That's the perfect environment to breed
               | superstition.
        
               | jessaustin wrote:
               | _19th century US farmers_
               | 
               | They were self-sufficient in the sense that there was no
               | one coming to help in case of emergency. That doesn't
               | mean they ever felt secure. Entire families regularly
               | died for one or more of dozens of causes: starvation,
               | freezing weather, tornadoes, human sickness, livestock
               | sickness, crop failure, drought, dangerous animals,
               | Indian attack, crime, etc. Some of them might have
               | pretended at a "control over their lives", but few modern
               | Americans would trade places with them.
        
         | notchFilter54 wrote:
         | You merely have to look at what's happened to the Second
         | Amendment to see what could happen to the First (felons'
         | rights gone, including for anybody caught with a wrong-sized
         | lobster or anybody who has lied to a federal agent; regulated
         | to oblivion in places like Hawaii).
         | 
         | The tyranny of the majority has already decided it's OK to
         | require a license to exercise a right, in many US
         | jurisdictions.
        
         | greg7gkb wrote:
         | I'm super tired of people conflating civil rights + first
         | amendment protections with the idea that speech anywhere, by
         | anyone, on any platform, deserves to be protected.
         | 
         | The goal of the First Amendment is to protect the citizens
         | of America from laws created by Congress / the government
         | that limit speech. That is nowhere near what we are talking
         | about when we're talking about _any_ speech conducted on a
         | private platform.
         | 
         | In fact, it's interesting to me that the argument has recently
         | been spun around such that some politicians are claiming that
         | social platforms are violating their first amendment rights by
         | blocking or banning. This has nothing to do with the intent or
         | language of the first amendment.
         | 
         | In my mind, you can be the most fervent civil rights advocate
         | _and still_ believe that Twitter/Facebook/etc. can ban anyone
         | they want for any reason. Even more so if you believe in free
         | enterprise and the right of a business to act in the way it
         | best sees fit.
         | 
         | I understand that platform bans have more implications and
         | repercussions than I'm outlining here in simple terms but still
         | the conflation is frustrating to me.
        
         | klyrs wrote:
         | > 2020-2021 has shown me that most people happen to be 'fair
         | weather fans' of civil rights.
         | 
          | A close read of the Constitution itself shows that this has
          | been the case since the very founding of the United States.
         | 
         | > The Privilege of the Writ of Habeas Corpus shall not be
         | suspended, unless when in Cases of Rebellion or Invasion the
         | public Safety may require it.
        
           | jaywalk wrote:
           | So because the Constitution defines some rather extreme
           | circumstances where Habeas Corpus is allowed to be suspended,
           | the Constitution is "fair weather" when it comes to civil
           | rights?
        
             | klyrs wrote:
             | Yes, rights are clearly revoked when the "weather" gets bad
             | enough. Today's argument is about where that "fair weather"
             | line is drawn.
        
             | dhanna wrote:
             | Do you have any familiarity with case law? The courts'
             | major job is to deal with the shades of grey that are our
             | reality.
        
               | jaywalk wrote:
               | Yes? I understand what courts do, my only point was that
               | setting out two extreme situations where a certain right
               | may be suspended doesn't constitute "fair weather" in my
               | eyes.
        
               | klyrs wrote:
               | The founders allowed for rights to be revoked when the
               | "weather" got extremely bad: agree or disagree?
        
               | jaywalk wrote:
               | Agree.
        
               | klyrs wrote:
               | I said earlier, the debate here is over what constitutes
               | "fair weather". Glad we agree.
        
         | quantumBerry wrote:
         | >2020-2021 has shown me that most people happen to be 'fair
         | weather fans' of civil rights
         | 
         | All you need to do is examine firearm rights, which have been
         | removed from felons and regulated into oblivion in places like
         | Hawaii to see what can happen to free speech. Both those
         | amendments are right next to each other and should apply to
         | everyone (who is not in jail for a crime), including Hawaiian
         | felons singing "Somewhere Over the Rainbow" while smoking a
         | big fat blunt with a machine gun on their back. But the tyranny
         | of the majority has gotten away with putting their fears above
         | the civil rights of others.
        
         | clairity wrote:
         | the lesson here should be that we're not really much different
         | from our ancestors of 5-50k years ago and because of that,
         | concentrations of power (and by extension, influence) are
         | inherently dangerous. leaders with too much power will
         | inevitably make mistakes, and keeping institutions limited and
         | focused means the harms from those mistakes are limited in
         | scope.
         | 
         | in government, that means extending federalism: smaller
         | governing bodies loosely federated (primarily for mutual
         | protection and interrelational fairness). in business, it means
         | truly competitive (and fair) markets with diverse participants,
         | not oligarchic ones (like these platforms).
        
         | alex_c wrote:
         | The last few years have shown me how many people are willing to
         | believe utterly stupid things, and how easy it is to make
         | people turn against each other. I knew this in the abstract,
         | but watching it happen in real time is something else.
         | 
         | The human mind (myself included) has some serious bugs, and
         | "we" as a society - with help from technology - are getting
         | better and better at exploiting these bugs at scale. I don't
         | think censorship is a solution, but I don't know what IS a
         | solution.
         | 
         | Incidentally, what happened to FUD (Fear, Uncertainty, Doubt)?
         | I miss that term, and I think it's much more descriptive of
         | what we're seeing these days than the more vague
         | "misinformation".
        
           | swagasaurus-rex wrote:
           | News companies have known for centuries that Fear,
           | Uncertainty, and Doubt are profitable. They've shrouded the
           | FUD behind claims of professionalism and legitimacy.
           | 
           | The internet has made it instantly and continuously
           | accessible, and just as dangerously, largely fabricated. Even
           | news about things that actually happened can have its
           | comments astroturfed by bad faith arguments or straight up
           | lies.
           | 
           | Censorship doesn't solve the FUD; that will never go away
           | while there's a profit motive (i.e., increasing clicks).
           | 
           | Censorship can't distinguish between truth and lies, that's a
           | problem journalism used to solve when it was profitable.
           | 
           | Censorship does solve brainwashing. Is the tradeoff worth it?
           | Hard to say.
           | 
           | All I know is all platforms legally need to censor illegal
           | content, so it becomes a hammer looking for a nail.
        
           | tzs wrote:
           | > The last few years have shown me how many people are
           | willing to believe utterly stupid things, and how easy it is
           | to make people turn against each other. I knew this in the
           | abstract, but watching it happen in real time is something
           | else.
           | 
           | If anyone wants some good examples of this, go to Reddit
           | and read the posts in /r/HermanCainAward that are marked
           | "Awarded". No need to read the comments there--they are
           | often rather mean. Just take a look at the submissions
           | themselves.
           | 
           | For those not familiar with /r/HermanCainAward, the typical
           | submission is a gallery of screenshots of someone's social
           | media posts, usually full of memes about why they are not
           | masking/distancing/getting vaccinated and invariably ending
           | with them getting COVID, asking for prayers, and then
           | someone else announcing that the person has died and often
           | asking for donations to help their widow and/or children
           | get by (because apparently the kind of person who feels
           | that they should get all their COVID advice from stupid
           | memes and conspiracy theories is also the kind of person
           | who doesn't believe in life insurance...).
           | 
           | > The human mind (myself included) has some serious bugs, and
           | "we" as a society - with help from technology - are getting
           | better and better at exploiting these bugs at scale. I don't
           | think censorship is a solution, but I don't know what IS a
           | solution.
           | 
           | This too is illustrated nicely on /r/HermanCainAward. Before
           | all this I would have thought that if I needed to convince a
           | lot of people to make the kind of mistakes that the HCA
           | winners do I would need to carefully craft an individual plan
           | for each one of them. I would have never guessed that just
           | making a dozen or so memes would be enough.
        
       | beardyw wrote:
       | > Anonymity and pseudonymity have played important roles
       | throughout history, from secret ballots in ancient Greece to 18th
       | century English literature and early American satire.
       | 
        | Sorry, I think anonymity is valuable in the world we find
        | ourselves in, but to suggest it was prevalent more than a few
        | decades ago is really stretching things. Yes, there are
        | examples, but ordinary people just wouldn't do it. Now, having,
        | say, an email address that is your actual name is a rarity,
        | even if you wanted it.
        
       | newbamboo wrote:
        | The CIA wants big tech monopolies because we are in direct
        | competition with China. Mimetic desire to be the dominant
        | great power. But the problem is that monopolies suck. They
        | stifle innovation and make the culture sluggish and less
        | dynamic. Where do we go from here? Probably some new great
        | power emerges that allows true freedom and fosters bottom-up
        | innovation. I just wonder where that will come from.
        
         | reedjosh wrote:
          | Decentralized and/or federated social and value-sharing
          | systems.
        
       | droptablemain wrote:
       | Anyone notice that liberal democracy seems to have a strong
       | reliance on illiberalism to maintain its stronghold?
        
       | seph-reed wrote:
       | Facebook, YouTube, and Twitter all _suggest_ this content to
       | people.
       | 
        | Anything they do past this point is just them trying to cover
       | up the fact that they pushed extremism onto the world.
        
       | sebow wrote:
        | You don't have to force YT, FB, etc. to host anything that
        | "violates their TOS", given that you make the distinction
        | clear in legislation: do American rights apply in the online
        | space? If yes, and you also consider YT, FB, etc. public
        | places (which has already been done by the courts), what's to
        | say that people's rights MUST NOT be enforced on these
        | platforms? That's the current debate, and it should conclude
        | in one of two ways: either those privately-owned places are
        | not public places, and thus they can do whatever they want, or
        | they are, in which case rights must be applied, to the letter.
        | The latter sounds scarier to non-US citizens because they
        | might see offensive material, but to make the distinction
        | clear in the law, a "bill of rights" should be made stating
        | that these rights apply to this 'new' digital information
        | medium as well. This is not "right to internet" as a "public
        | utility", because that's mostly nonsense created for political
        | gain amongst the young. The long-term effects of current
        | censorship will eventually set things naturally: people will
        | create off-grid mesh networks, plugged into the mainstream
        | internet, to fight off censorship, and it will be like
        | something out of a dystopian movie. This won't really be the
        | "web 3.0", because you're still relying on "centralized
        | pipelines" that tie everything together.
        | 
        | By the way, this will inevitably also spill into
        | property/copyright issues, and the law is shaky there as well
        | (arguably even more so, compared to freedom of expression).
        | The "I store these bits on my machine but I don't own the
        | copyright to them" situation is a big issue and frankly
        | something disturbing. I would rather see copyright holders
        | enforce streaming (where you don't "store" their content
        | entirely at any given time) than continue with this DRM mess
        | and everything that comes with it.
        
       ___________________________________________________________________
       (page generated 2021-09-29 23:02 UTC)