[HN Gopher] Social Media First Amendment Cases
       ___________________________________________________________________
        
       Social Media First Amendment Cases
        
       Author : Mjadams
       Score  : 64 points
       Date   : 2024-02-26 17:11 UTC (1 day ago)
        
 (HTM) web link (www.lawfaremedia.org)
 (TXT) w3m dump (www.lawfaremedia.org)
        
       | btilly wrote:
       | It isn't often that I see a case which really could change the
       | internet. Particularly eye-opening was
       | https://www.supremecourt.gov/DocketPDF/22/22-277/292540/2023... -
       | an amicus brief from the moderators of the subreddits r/law and
       | r/SCOTUS. It brought home how different a world with laws like
       | this could be. And I had to laugh at pages 15-22. "We aren't
       | going to argue the law. We're just going to show you the content
       | that we are moderating." Followed by screenshots of attacks, many
       | of them death threats, aimed at the Supreme Court. That's one way
       | to get a judge's attention!
       | 
       | Arguments were heard yesterday. Luckily
       | https://www.npr.org/2024/02/26/1233506273/supreme-court-soci...
       | suggests that the justices were skeptical of the Florida and
       | Texas laws.
       | 
       | Arguments are being heard today.
       | https://www.oyez.org/cases/2023/22-555 links to a current
       | recording.
        
         | michaelt wrote:
          | Wow, that's a great find! You don't even have to read beyond
          | the table of contents:
         | 
         |  _> TABLE OF CONTENTS [...]
         | 
         | > ARGUMENT ............... 6
         | 
         | > Amici Censor Irrelevant and Inappropriate Speech to Cultivate
         | Healthy Online Communities Built on Common Interests. This
         | Includes Removing Death Threats Aimed at Members of This Court.
         | ........... 9
         | 
         | > Amici Could Be Sued for Censoring Internet Trolls Who Are
         | Calling for the Execution of Supreme Court Justices. ........
         | 19
         | 
         | > Those Who Are "Censored" by Amici Can Speak Elsewhere.
         | ........... 22_
        
         | japhyr wrote:
         | That was a fantastic read. My favorite excerpt:
         | 
         | > These laws are not about protecting speech. They're about
         | politicians ensuring that a favored constituency has access to
         | someone else's megaphone to spread a message
         | 
         | I was a high school teacher for a long time. When explaining to
         | students how I facilitated discussion in the classroom, I
         | talked with them about the idea of the loudest voice. If you
         | don't moderate a discussion thoughtfully, it will be taken over
         | by the few people with the loudest voices.
         | 
         | We see this same dynamic all the time in online communities.
         | Moderators quiet the loudest voices so everyone else can have a
         | meaningful conversation. These laws are aimed at letting the
         | loudest people shout everyone else down, so the people trying
         | to hold onto power can't be challenged.
        
           | sanity wrote:
           | > They're about politicians ensuring that a favored
           | constituency has access to someone else's megaphone to spread
           | a message
           | 
           | I think this is an inaccurate characterization. Over the past
           | 20 years content creators were encouraged to use and invest
           | in building an audience on these platforms based on the tacit
           | understanding that they would behave like content neutral
            | common carriers. If the platforms' willingness to abuse
            | this power had been apparent from the start, they would
            | never have become so powerful.
           | 
           | People created their own "megaphones" over many years only
           | for the platform to start dictating to them how they can
           | communicate with the audiences they've built, often based on
           | naked political animus. It was a bait and switch - a fraud.
           | 
           | That said, having listened to most of the oral arguments I
           | think SCOTUS will strike down these laws.
           | 
           | The fundamental problem is that the Internet centralizes
           | power in a small number of corporations which then become
           | ripe for capture by powerful interests, whether political,
           | ideological, or commercial.
           | 
           | IMO the ultimate solution is a decentralized Internet like
           | https://freenet.org/ that doesn't require creators to hand
           | over control of their voices to powerful third parties.
        
             | nobody9999 wrote:
             | >Over the past 20 years content creators were encouraged to
             | use and invest in building an audience on these platforms
             | based on the tacit understanding that they would behave
             | like content neutral common carriers.
             | 
             | That's a strong assertion. Do you have any _evidence_ to
             | back that up?
             | 
             | AIUI, the _law_ governing this (at least in the US -- which
              | is relevant since we're discussing US state laws being
             | challenged in the courts) is the Communications Decency Act
             | of 1996[0], with section 230[1] of that law being the
             | pointy end of the stick WRT these particulars. I would also
             | point out that the above serves as a front-end (in that it
             | allows the courts to reject lawsuits misdirected at
             | platforms, when they should be directed at the source of
             | the speech) to the First Amendment[2].
             | 
             | Please detail where, exactly, any of the above supports the
             | claim that "content creators were encouraged to use and
             | invest in building an audience on these platforms based on
             | the tacit understanding that they would behave like content
             | neutral common carriers."
             | 
             | Perhaps I'm missing something important (which is certainly
             | possible) and I'd appreciate being enlightened. As such, I
             | look forward to your response.
             | 
             | [0]
             | https://en.wikipedia.org/wiki/Communications_Decency_Act
             | 
             | [1] https://www.techdirt.com/2020/06/23/hello-youve-been-
             | referre...
             | 
             | [2] https://en.wikipedia.org/wiki/First_Amendment_to_the_Un
             | ited_...
             | 
             | Edit: Cleaned up prose.
             | 
              | Additional Edit: I'm responding to [3] below, as I'm
              | apparently rate limited and wanted to make sure I was
              | clear about what I'm trying to say, since I'm pretty
              | passionate about it:
             | 
             | That's as may be, and I remember it fondly too.
             | 
             | However, it wasn't by government or the law that "content
             | creators were encouraged to use and invest in building an
             | audience on these platforms..."
             | 
             | Rather it was those platforms themselves. It would be easy
             | for me to just say "sucker! you've been rooked!" and leave
             | it at that.
             | 
             | However, I'll (attempt to) respond substantively by saying
             | that the surveillance capitalism[4] business model was
             | still nascent and there were many more options for
             | interpersonal interaction back then.
             | 
              | It was inevitable that, after the consolidation of
              | "social" media, advertiser influence (since advertisers
              | are actually the customers -- not you) would be paramount
              | to these platforms.
             | 
              | And that's what drives the moderation/censorship folks
              | are complaining about: advertisers don't want to be
              | associated with anything controversial; they just want
              | you to buy their products/services while they (the
              | advertisers and the platforms, as well as various
              | middlemen) collect (a la [4]) as much information about
              | you as they can to aid in that process.
             | 
             | [3] https://news.ycombinator.com/item?id=39527216
             | 
             | [4] https://en.wikipedia.org/wiki/Surveillance_capitalism
        
               | hamhock666 wrote:
               | I think to most people on the internet in the early
               | 2000s, it was unthinkable that content on the major
               | platforms would be censored based on political content. I
                | don't think there is any evidence for that; it's more
                | of a general feeling, plus the fact that political
                | censorship took off in the next decade.
               | 
               | If these platforms had started out censoring particular
               | political content, they would not have had the same mass
               | adoption, or there may have been more free-speech
               | competitors.
        
               | sanity wrote:
               | Agreed, the overt political censorship didn't start until
               | the mid-2010s, over a decade after today's tech giants
               | solidified their dominant positions.
        
               | joshuamorton wrote:
               | I'd potentially reverse this: 'overt' political
               | censorship didn't start until certain creators started to
               | market their otherwise objectionable content as
               | "political" as a way to bypass existing moderation
               | policies.
        
               | numpad0 wrote:
               | It's App Store. It all started with the App Store.
        
               | Zak wrote:
               | Advertisers are also considerably more picky about the
               | content they're willing to be associated with, and
               | considerably more influential with regard to the content
               | most people see online than they were 20 years ago.
               | Advertiser demands have produced rapid policy changes
               | from the likes of Youtube and Reddit where other forms of
               | pressure have had little effect.
        
               | ImPostingOnHN wrote:
               | I agree with the person you're responding to, that the
               | assertions you're making here don't seem to be supported
               | by the evidence:
               | 
                | As one of those people, I can say it was never
                | unthinkable. Even during the 2000s, it was common to
                | ban people from MSN chatrooms. Before that, it was
                | common to ban people from IRC channels, individual IRC
                | servers, or even entire IRC networks. Did the GNAA have
                | a right to "be platformed" indefinitely wherever they
                | wanted?
               | 
                | Also, the "censorship" in question more often deals
                | with things like insults and other incivility,
                | spamming, death threats, etc. _Not_ mere "political"
                | speech. I doubt many folks were banned from /r/SCOTUS
                | for politely saying _"I politically disagree with this
                | ruling"_. We see via screenshots in the amicus brief
                | the sort of stuff that was _actually_ moderated.
               | 
               | Indeed, the "censorship" which spawned these laws was the
               | banning of a dude actively calling for violent
               | insurrection against the government, and receiving it,
               | and continuing to encourage it _during_ the violent
                | insurrection. That's the "political speech" the bill
               | authors had in mind when drafting it. It's possible that
               | the "censorship" in question is all that stopped the
               | putsch from succeeding. One can see why fans of said dude
               | and his insurrection were so upset by that.
        
               | hamhock666 wrote:
               | I think you're right that people have been banned from
               | things as long as the internet has been around. However
               | more tools exist today to censor, things like AI
               | generating and reading subtitles for YT demonetization
               | and algorithm deranking.
               | 
                | If someone makes a threat on a person's or group's life
                | online, or does something illegal, then I agree it
                | shouldn't be allowed. But censorship today goes far
                | beyond that.
               | 
               | The purpose of the platform matters. MSN chats (group
               | chats?) and sub-reddits are smaller places presumably set
               | up by another user for a specific purpose. I have no
               | problem with people being banned/censored for whatever
               | reason from these smaller forums.
               | 
               | I have an issue with censorship when the platform is
               | generic, not dedicated to a particular topic or group,
               | like Twitter, YouTube, or Reddit as a whole. When one or
               | more dominant third party platforms censor the same
               | people, it has an effect similar to that of government
               | censorship.
               | 
               | I also think the censorship will backfire, because by
               | being shut down, it gives power to the ideas being
               | censored. "There must be a reason they are shutting down
               | discussion. They have no real answer to it!"
               | 
               | I agree that online harassment is ugly, but I still
               | believe in absolute free speech on these generic
               | platforms. The best solution to all of this would be to
               | have block lists that people could opt-in or out of.
               | Don't want to see something? Subscribe to the block list.
        
               | ImPostingOnHN wrote:
               | _> more tools exist today to censor, things like AI
               | generating and reading subtitles for YT demonetization
               | and algorithm deranking_
               | 
                | I don't see how that justifies the government
                | compelling you or me or IRC chanops or subreddit
                | moderators or Twitter admins to say what the government
                | wants us to say. None of those tools were used to ban
                | the former president when he was engaging in the
                | "political speech" of inciting a violent insurrection
                | and encouraging it while it was happening.
               | 
                |  _> If someone makes a threat on a person's or group's
                | life online, or does something illegal, then I agree it
                | shouldn't be allowed. But censorship today goes far
                | beyond that._
               | 
               | And yet, the speech the government is trying to compel
               | here includes, but is not limited to: insults; slurs;
               | obscenity; spam; inciting violence; inciting
               | insurrection; death threats; and more.
               | 
               |  _> The purpose of the platform matters_
               | 
               | Does it, though? That seems like an arbitrary line drawn
               | to avoid logical inconsistencies. Who defines what the
               | purpose is? Who defines how it matters?
               | 
               |  _> The best solution to all of this would be to have
               | block lists that people could opt-in or out of. Don't
               | want to see something? Subscribe to the block list._
               | 
                | Is it, though? This part of the post you replied to
                | bears repeating:
               | 
               |  _> Indeed, the  "censorship" which spawned these laws
               | was the banning of a dude actively calling for violent
               | insurrection against the government, and receiving it,
               | and continuing to encourage it during. That's the
               | "political speech" the bill authors had in mind when
               | drafting it. It's possible that the "censorship" in
               | question is all that stopped the putsch from succeeding._
               | 
               | Blocklists wouldn't have prevented it. If you or I or
               | Twitter don't want to aid and abet violence and
               | insurrection, the government should not be able to compel
               | us to do so.
        
               | sanity wrote:
               | >>Over the past 20 years content creators were encouraged
               | to use and invest in building an audience on these
               | platforms based on the tacit understanding that they
               | would behave like content neutral common carriers.
               | 
               | > That's a strong assertion. Do you have any evidence to
               | back that up?
               | 
                | I was there; I remember it. I co-founded a pioneering
                | but ultimately unsuccessful online video company around
                | 2006 called Revver that had to deal with moderation
                | issues quite early on; YouTube, Vimeo, and Google Video
                | were our competitors. The idea that we would use this
                | power in a politically biased way never occurred to us;
                | it was _obvious_ that it would be unacceptable, even
                | unconscionable.
               | 
               | And we weren't the exception, we were the norm that
               | included companies like YouTube and Reddit. I think
               | anyone working on a user-generated content startup at
               | that time would tell you the same thing.
        
               | creato wrote:
               | > I co-founded a pioneering but ultimately unsuccessful
               | consumer-facing video company around 2006 called Revver
               | that had to deal with moderation issues quite early on.
               | The idea that we would use this power in a politically
               | biased way never occurred to us, it was obvious that it
               | would be unacceptable, it was unconscionable.
               | 
               | In 2006, the internet was probably dominated by western
               | audiences. Now, the user base of any large social media
               | site includes much more of the world, including groups
               | like Hamas, ISIS, etc. and they would absolutely claim
               | their content is political.
        
               | sanity wrote:
               | > In 2006, the internet was probably dominated by western
               | audiences. Now, the user base of any large social media
               | site includes much more of the world, including groups
               | like Hamas, ISIS, etc. and they would absolutely claim
               | their content is political.
               | 
               | After 9/11 radicalization on the Internet was a serious
               | concern - but most people seemed to understand that
               | freedom of speech meant tolerating speech you didn't
               | like.
               | 
               | For example, just a year after 9/11 the New York Times
               | published a letter by Osama bin Laden explaining his
               | views and motivations. That's almost impossible to
               | imagine today (when apparently the NYT will fire a writer
               | for admitting they like Chick-fil-A).
        
               | vkou wrote:
               | > After 9/11 radicalization on the Internet was a serious
               | concern - but most people seemed to understand that
               | freedom of speech meant tolerating speech you didn't
               | like.
               | 
               | Maybe in the filter bubble of a comp sci student lounge.
               | 
               | You've forgotten how everyone else lost their mind in
               | 2001, and rallied behind a folksy strongman, and his
               | cabal of stooges, useful idiots, and bad Boyars.
               | 
                | Most people at the time seemed to believe that finding
                | and exterminating every terrorist (alleged or
                | otherwise) the government could get its hands on would
                | be the only way we could remain free. They hate us for
                | our freedom, and all that. Things like the PATRIOT Act
                | had support outside the nerd-sphere, the NSA
                | revelations were a nothingburger to most people, and as
                | for torturing people in Abu Ghraib and Gitmo -- well,
                | they were all guilty anyways.
               | 
                | The only reason those views and motivations get any
                | airtime is so they can be attacked (without opportunity
                | for debate or rebuttal). Had his values and goals been
                | less comically antithetical to ours, or had he not been
                | condemned by his own actions as an irredeemable monster
                | (which further damns anything he has to say), you
                | wouldn't have seen a whiff of them. We only hit those
                | we can beat.
        
               | sanity wrote:
               | > Maybe in the filter bubble of a comp sci student
               | lounge.
               | 
                | My example of the NYT publishing Osama bin Laden had
                | nothing to do with comp sci student lounges. Can you
                | imagine the NYT publishing something like that today?
                | The equivalent might be an op-ed by Vladimir Putin; it
                | wouldn't happen. The views of the legacy media have
                | shifted dramatically on free speech over the past two
                | decades, particularly over the past 8 years.
        
               | philipkglass wrote:
               | By the time the United States Senate held its "Jihad 2.0"
               | hearing in 2015 [1], social media companies were already
               | working to prevent ISIS and other terrorist groups from
               | spreading their messages or media. If there was any
               | pressure from government and the general public, it was
               | to the effect that social media companies weren't
               | censorious _enough_ regarding this sort of content.
               | 
               | I personally fall on the "maximal free speech" side of
               | preferences. If an ISIS video isn't calling for imminent
               | lawless action [2], then I don't want YouTube deleting
               | it. But I don't think that YouTube should be legally
               | _required_ to keep hosting an ISIS video that is
               | technically legal under the First Amendment, and I also
                | understand that my preference is unpopular. I'd be
               | surprised if more than 10% of Americans agreed with me
               | when I say that an ISIS member should be able to publish
               | repugnant, bigoted, propagandistic, even _violent_ videos
               | so long as those videos are legal under First Amendment
               | standards.
               | 
               | I personally want minimal filtering on this sort of
               | content because:
               | 
               | 1) I'm in no danger of actually being recruited to a
               | terrorist cause.
               | 
               | 2) Primary sources, like combat videos from the Syrian
               | civil war (including propaganda videos that compile such
               | clips) can provide more information about ongoing
               | conflicts, combatants, and world events than general
               | purpose news outlets can provide.
               | 
               | I think that most people are against dissemination of
               | this sort of content because point 2 is irrelevant to
               | them (they're not news junkies closely following armed
               | conflicts) and, stochastically speaking, a few people who
               | see terrorist propaganda videos will be persuaded to take
               | up terror. Or maybe it's even less calculating: the
               | reasoning could be as simple as "ISIS has a disgusting
               | ideology, and disgusting ideologies shouldn't get free
               | speech considerations."
               | 
               | The extreme breadth of speech protected under the First
               | Amendment is more than what the median American wants
               | protected, and spans more than what social media
               | companies want to provide hosting for. High minded
               | commitments to free speech like news outlets publishing a
               | letter by Osama bin Laden in 2002 [3] are more notable
               | for their rarity than for exemplifying a general standard
               | of the time.
               | 
               | [1]
               | https://www.hsgac.senate.gov/hearings/jihad-20-social-
               | media-...
               | 
               | [2] https://en.wikipedia.org/wiki/Brandenburg_v._Ohio
               | 
               | [3] https://en.wikipedia.org/wiki/Letter_to_the_American_
               | People
        
               | jcranmer wrote:
               | > I personally fall on the "maximal free speech" side of
               | preferences. If an ISIS video isn't calling for imminent
               | lawless action [2], then I don't want YouTube deleting
               | it.
               | 
               | An online video is almost by definition incapable of
               | being incitement to imminent lawless action. The
               | threshold for this magic phrase is roughly along the
               | lines of leading a mob and saying "there is <member of
               | outgroup>, lynch them." Pretty much anything short of
               | that is constitutionally-protected free speech under the
               | First Amendment, and it's virtually impossible for an
               | internet video to be sufficiently imminent for the
               | purposes of incitement to imminent lawless action.
        
               | philipkglass wrote:
               | Interesting, I didn't realize that the "imminent"
               | component disqualified recordings. I would have thought
               | that an exhortation to do something lawless as soon as
               | you hear the message would count. Something like "Here's
               | the home address of a Senator who called for military
               | action against the Islamic State -- go burn his house
               | down now!"
        
               | jcranmer wrote:
               | > I would have thought that an exhortation to do
               | something lawless as soon as you hear the message would
               | count.
               | 
               | That is almost literally the opposite of what Brandenburg
               | v Ohio held! (Many people tend to do this: they assume
               | that the phraseology is meant to limit speech along the
               | lines of Schenck v US or Whitney v California, what is
               | now known as the "clear and present danger" standard,
               | which is what Brandenburg v Ohio was explicitly
               | overturning.)
               | 
               | Brandenburg v Ohio held that advocating violent overthrow
               | of the government is free speech, but you need the very
               | direct link between the speech and the illegal actions
               | for the speech to become illegal. The emphasis in
               | "incitement to imminent lawless action" of Brandenburg is
               | meant to be "imminent", and later court cases have
               | generally held that "imminent" is meant to be read in
               | very short timespans.
        
               | ImPostingOnHN wrote:
               | _> stochastically speaking, a few people who see
               | terrorist propaganda videos will be persuaded to take up
               | terror_
               | 
               | Empirically speaking, we saw more than a few of these
               | people on January 6th.
        
             | ToucanLoucan wrote:
             | > Over the past 20 years content creators were encouraged
             | to use and invest in building an audience on these
             | platforms based on the tacit understanding that they would
             | behave like content neutral common carriers.
             | 
             | Beyond what the other (excellent) comment had to say about
             | this, why does what content creators may have believed mean
             | fuck-all in terms of what these cases are arguing? Nobody
             | would have used Facebook if they thought Facebook would
             | "censor" them?
             | 
             | And besides that, your assertion that it's a bait and
             | switch implies that all these creators would've been able
             | to foster the exact same followings using, what, their own
             | websites? Accounts on different social networks? Dubious at
             | best. A lot of the aforementioned creators benefited
             | strongly from the network effects of those platforms and
             | the algorithms bringing their content to new followers.
              | It's likely that if they hadn't grown up on whichever
              | platform they picked, they wouldn't have grown at all.
        
               | sanity wrote:
               | > why does what content creators may have believed mean
               | fuck-all in terms of what these cases are arguing?
               | 
                | I didn't say it did; I acknowledged that SCOTUS would
                | probably rule against the TX and FL laws. I'm talking
                | about the broader context.
               | 
               | > Nobody would have used Facebook if they thought
               | Facebook would "censor" them?
               | 
                | I think if people knew then what they know now about
                | how these platforms would behave starting around 2015,
                | the platforms would have had a much more difficult time
                | achieving their dominant positions; there would have
                | been _far_ more skepticism.
               | 
               | Of course it's hard to prove a counterfactual but that's
               | my view having been a small part of it.
        
               | hellojesus wrote:
               | > I think if people knew then what they know now about
               | how these platforms would behave starting around 2015
               | they would have had a much more difficult time achieving
               | their dominant positions, there would have been far more
               | skepticism.
               | 
               | Maybe. The polarization of the country has increased
               | significantly since the COVID era.
               | 
               | Plus most of this censorship discourse really only occurs
               | at the edges. 99% of my meatspace social network cares
               | about seeing puppy videos or whatever Taylor Swift has to
               | say. Exactly zero of them have any expectation that their
               | posts will ever be censored because they don't post
               | anything oppositional.
               | 
               | I think that, even with censorship, absent a real
               | alternative competitor, the networks would have grown all
               | the same. The only difference is maybe TruthSocial or
               | Rumble would have spun off a few years earlier.
        
             | Sohcahtoa82 wrote:
             | I'd support Freenet if it wasn't for their massive CSAM
             | problem. I absolutely hate that I know what "PTHC" stands
             | for.
             | 
             | Maybe it's changed. I haven't touched Freenet since...I
             | think 2010? I just remember seeing index pages with links
             | to tons of sites on Freenet, and a considerable number of
             | them were links to CSAM.
             | 
             | Nope. Deleted that. Ain't gonna be a part of that.
        
               | sanity wrote:
               | I find that surprising; even in 2010 it was difficult to
               | find illegal content on Freenet unless you were looking
               | for it - and certainly in recent years it's virtually
               | impossible, as the default indexes are carefully vetted.
               | 
               | In any case, the original Freenet was never going to be a
               | general-purpose replacement for today's centralized
               | services as it can only handle static content. For the
               | past few years we've been working on a sequel to Freenet
               | that is much more general-purpose, you can learn about it
               | at https://freenet.org/. While the original Freenet was
               | analogous to a decentralized hard drive, the new Freenet
               | is like a decentralized computer.
               | 
               | As part of it we're building a decentralized reputation
               | system (based on a "web of trust") to address
               | illegal/offensive/objectionable content.
        
               | at_a_remove wrote:
               | I'm guessing it isn't Parent-Teacher Home Conference.
        
               | MikeTheGreat wrote:
               | Not to embarrass you by doing a nearly effortless Google
               | search for the answer, but it's clearly Percutaneous
               | transhepatic cholangiography /s [1]
               | 
               | On a more serious note - part of me really wants to know,
               | and a much larger part of me doesn't.
               | 
               | [1] https://en.wikipedia.org/wiki/Percutaneous_transhepat
               | ic_chol...
        
               | at_a_remove wrote:
               | Amusingly, I nearly had that procedure done.
        
               | duskwuff wrote:
               | For better or worse, the answer to your question is in
               | the header of the Wikipedia page you linked.
        
               | pbhjpbhj wrote:
               | The half of HN without impeccable impulse control
               | are probably on a list now. You could have said "hate
               | that I learnt lingo used by abusers, just by being on
               | Freenet".
               | 
               | For those wondering, the initialism relates to extreme
               | child abuse. And now I too get to wish I hadn't looked.
        
               | A4ET8a8uTh0 wrote:
               | FWIW, I am relatively certain a good chunk of HN users
               | are marked as "mostly harmless" already. In other words,
               | we are all on the list. We just have different tags:D
               | 
               | edit: Or at least, if I was responsible for running IC,
               | we would all be on the list with appropriate tags.
               | Thankfully, I am not.
        
             | jcranmer wrote:
             | Mike Masnick wrote the speed run guide to content
             | moderation here: https://www.techdirt.com/2022/11/02/hey-
             | elon-let-me-help-you...
             | 
             | In short, the problem is that there is no universally
             | agreed-upon definition of "problematic" content. No matter
             | what definition you think is reasonable for moderation,
             | there will both be people who complain that it is too
             | permissive and people who complain that it is too
             | restrictive. At a large enough scale, people on both
             | extremes of the spectrum will be complaining about your
             | moderation policies--and at very large scales, there will
             | be politicians willing to listen to them!
        
               | sanity wrote:
               | There's definitely a slippery slope when you start with
               | centralized moderation, which seems to be the article's
               | central argument.
               | 
               | With the new Freenet, we're using a decentralized
               | moderation system that lets users control what they see
               | or don't see, based on sensible defaults. This builds on
               | a concept from the original Freenet called web-of-
               | trust[1]. I believe this method, combined with the fact
               | that centralized censorship is impossible in the new
               | Freenet due to its lack of centralized control, offers a
               | strong alternative.
               | 
               | [1] https://github.com/hyphanet/plugin-
               | WebOfTrust/blob/master/de...
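               | 
               | As a rough illustration of the idea (hypothetical names
               | and numbers, not the actual Freenet implementation),
               | web-of-trust filtering might work like this: each user
               | rates a few peers, ratings propagate with per-hop
               | attenuation, and the viewer's own client decides what to
               | hide:

```python
# Hypothetical sketch of web-of-trust moderation (illustrative only,
# not Freenet's real algorithm). Users rate peers in [-1, 1]; ratings
# propagate transitively, attenuated per hop, so each viewer's client
# can hide content from authors their trust network rates negatively.

def trust_score(graph, viewer, author, decay=0.5, depth=3):
    """Best trust estimate from viewer to author, or None if unknown.

    `graph` maps each user to a dict of {peer: direct_rating}.
    """
    best = graph.get(viewer, {}).get(author)
    if depth == 0:
        return best
    for peer, rating in graph.get(viewer, {}).items():
        if rating <= 0:
            continue  # don't propagate trust through distrusted peers
        via = trust_score(graph, peer, author, decay, depth - 1)
        if via is not None:
            candidate = rating * decay * via
            if best is None or candidate > best:
                best = candidate
    return best

def visible(graph, viewer, author, threshold=0.0):
    # Unknown authors stay visible by default; only a negative score
    # from the viewer's own trust network hides them.
    score = trust_score(graph, viewer, author)
    return score is None or score >= threshold
```

               | Here a viewer who never rated a troll directly still
               | inherits a peer's negative rating, while a viewer with
               | different peers would see a different feed - no central
               | moderator involved.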
        
               | echelon wrote:
               | > In short, the problem is that there is no universally
               | agreed-upon definition of "problematic" content.
               | 
               | I got banned from my city's subreddit for making a mild
               | criticism about crime once. Now I'm shut off from events,
               | networking, etc.
               | 
               | I'm from the southeast, which is a conservative area of
               | the country. The internet I grew up on had thicker skin,
               | wasn't trigger happy with banning, and because of that I
               | was exposed to ideas and perspectives that I wouldn't
               | have ordinarily been in contact with. People didn't have
               | a disgruntled disposition that immediately banned
               | opposing views on sight, so I was able to soak in so much
               | information.
               | 
               | Expressly because of this, I was able to grow from a
               | conservative kid into a well-rounded moderate. I
               | don't think I could replicate this experience easily on
               | today's internet. People are divided into factions and
               | are quick to mute and ban those they disagree with. (The
               | rage-centric algorithms also heighten confrontational
               | language and showdowns, but that's another issue.)
               | 
               | Censorship almost always leads to imbalanced power
               | dynamics and gatekeeping. It's become a tit-for-tat tool
               | of retaliation and playing "gotcha". People enjoy
               | digitally flipping each other off by banning them. It's
               | super fucked.
               | 
               | Filtering is important, but we should have protocols for
               | that where individuals have complete control and
               | discretion. I don't know how that's supposed to work on
               | platforms that are wholly controlled by profit-seeking
               | corporations, but maybe legislation can craft a way
               | forward.
               | 
               | I really hope we don't dive deeper into the censorship
               | rabbit hole. It genuinely terrifies me more than any
               | other issue facing humanity.
        
               | AnthonyMouse wrote:
               | > In short, the problem is that there is no universally
               | agreed-upon definition of "problematic" content. No
               | matter what definition you think is reasonable for
               | moderation, there will both be people who complain that
               | it is too permissive and people who complain that it is
               | too restrictive.
               | 
               | But this is rather the point. What you need is a wall
               | between hosting and discovery.
               | 
               | Alice has some rather controversial things to say, Bob
               | wants to hear about them and Caren doesn't. So Bob and
               | Caren should be using different filtering systems that
               | they can choose for themselves. What _shouldn 't_ happen
               | is that Alice gets banned from the internet or shadow
               | banned on a dominant platform with a network effect just
               | because Caren doesn't like her, because Caren should only
               | be deciding for Caren and not Bob.
               | 
               | This doesn't mean Caren has to design her own filtering
               | system. She can use someone else's. But she should have a
               | _choice_ in which one to use, _independent of the
               | underlying platform_ , and so should Bob.
               | 
               | Right now we tie the filtering system to the thing with
               | the network effect and then have Zuckerberg or Musk
               | deciding for everybody when it should be everybody
               | deciding for themselves.
        
               | firejake308 wrote:
               | The problem is that it's an awful lot of work for
               | everyone to decide for themselves, and I don't think the
               | average user is going to care enough to build their own
               | moderation algorithm. They'll just complain and move on
        
               | AnthonyMouse wrote:
               | > _This doesn 't mean Caren has to design her own
               | filtering system. She can use someone else's._
               | 
               | There can be one filter that only blocks spam and scams,
               | another that blocks conservatives, another that blocks
               | spam, scams and celebrity gossip, another that only shows
               | technical content etc. You open the config options and
               | pick one, like you choose an ad blocker or a radio
               | station. If you don't like it you pick a different one.
               | It shouldn't be that hard.
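               | 
               | A minimal sketch of that separation between hosting and
               | filtering (all names here are made up for illustration):
               | the host returns every post, and each viewer plugs in
               | whichever filter they choose:

```python
# Made-up sketch of separating hosting from discovery: the host
# stores and returns every post, while filtering is a pluggable
# function each viewer chooses for themselves.

from typing import Callable, Iterable

Post = dict  # e.g. {"author": "...", "topic": "...", "text": "..."}
Filter = Callable[[Post], bool]  # True means "show this post"

def blocks_spam(post: Post) -> bool:
    return post.get("topic") not in ("spam", "scam")

def technical_only(post: Post) -> bool:
    return post.get("topic") == "technical"

def feed(hosted_posts: Iterable[Post], chosen: Filter) -> list:
    # The platform never filters; it hands everything to the
    # viewer's chosen filter, so no one decides for anyone else.
    return [post for post in hosted_posts if chosen(post)]
```

               | Two viewers read the same hosted posts but see different
               | feeds, and swapping filters requires no cooperation from
               | the platform.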
        
           | nradov wrote:
           | What you say is true only in an idealized fantasy world. We
           | might sometimes get close to that ideal in small communities
           | such as a classroom. But in the real world, moderators on
           | large online platforms constantly abuse their power to push
           | preferred narratives and suppress dissent rather than
           | facilitating meaningful conversations.
        
           | fallingknife wrote:
           | But this doesn't fundamentally change the situation. It just
           | effectively makes the moderator the loudest voice. This is
           | not necessarily a bad thing; e.g., in your classroom example,
           | a teacher moderating a bunch of kids works very well. When it
           | is the executives of a few large corporations moderating
           | everybody, I'm not so sure.
        
           | saurik wrote:
           | I think there is a big difference between "I am using Twitter
           | and thereby I expect random people to get my tweets promoted
           | by their algorithm" and "I am using Twitter and expect that
           | the people who actively went out of their way to follow me
           | will get to see my content" or even "I am using Twitter and
           | expect that the data I gave them in trust to store won't
           | arbitrarily be deleted without first attempting to return it
           | to me" (along with a minimum retention period and potentially
           | some requirement to hand over the data to the state rather
           | than destroy it... note that these are the kinds of
           | requirements we legally place on rental storage agreements).
           | Most of the functionality of these social media platforms is
           | NOT similar to a forum, and if I chose to follow someone it
           | makes no sense for me to whine about the things they say: I
           | can and should just unfollow them.
        
           | almatabata wrote:
           | In person two people cannot talk at the same time hence one
           | person can monopolize the discussion unless moderated.
           | 
           | The same thing does not necessarily happen on social media,
           | at least not in the same way. Two people can talk and post at
           | the same time without preventing each other from
           | communicating to their audience. And you can filter obnoxious
           | users should you wish to do so.
           | 
           | But if the platform bans people or deletes posts because they
           | discuss things they disagree with politically, you prevent
           | the audience from accessing content they might find
           | interesting. In this case the moderation is trying to shut
           | down the debate rather than facilitate it.
           | 
           | You need moderation, but the big platforms have clearly shown
           | that they have trouble resisting the temptation of abusing
           | it.
        
         | psunavy03 wrote:
         | Somewhat ironic considering the mods of those subs have
         | deliberately banned anyone who disagrees with their political
         | slant and stickied a post at the top of one of them stating
         | words to the effect of "this is not the place to be wrong and
         | belligerent about it."
         | 
         | But I guess a bunch of people that full of themselves are full
         | enough of themselves to think that SCOTUS is going to care
         | about an amicus written by a bunch of anonymous Internet
         | people. Sure, they moderate death threats, but they also
         | moderate away anything they might have to disagree with. And
         | then pull stunts on the order of "I'll reinstate your account
         | if you write a 500-word essay on why you're wrong."
        
       | rdtsc wrote:
       | > In their briefs responding to Texas and Florida, the platforms
       | argued that both laws infringe on their First Amendment rights by
       | preventing them from exercising editorial discretion--which they
       | view as protected expressive conduct--in deciding when and how to
       | disseminate speech. The platforms asserted that the laws would
       | prevent them from removing problematic content, such as terrorist
       | propaganda.
       | 
       | Interesting take. So the platforms, as entities, want them some
       | first amendment speech rights. And of course, it's all been about
       | the terrorists. Thank you Google, Facebook and others, for
       | keeping us safe from those villains.
        
         | lgleason wrote:
         | If they want editorial discretion, then they are
         | publishers...no section 230. If they want to be common carriers
         | then they should not be making editorial decisions.
         | 
         | Ironically, if they had not taken steps as egregious as
         | suspending the account of the current president of the US these
         | laws would never have been considered by FL or TX.
        
           | dingnuts wrote:
           | section 230 does allow platforms the ability to remove
           | unwanted content
        
           | throwawa14223 wrote:
           | This seems so obviously correct to me. The issue seems to be
           | big companies wanting to have it both ways and screwing
           | everyone else.
        
           | michaelt wrote:
           | Imagine if you time-travelled back to 2002, before Reddit and
           | Facebook, when the internet had thousands of unconnected
           | phpBB forums.
           | 
           | I happen to think the Porsche Cayenne is an ugly midlife-
           | crisis-mobile driven by assholes. Should the Porsche Owners
           | Club phpBB forum be permitted to censor my speech, when I
           | want to share my opinion that Cayenne owners can suck a fat
           | one?
           | 
           | In 2002 the answer would be obvious - it's no problem if they
           | censor me, there's like 1000 other forums I can go to.
           | 
           | Are things really so different these days?
        
             | mrguyorama wrote:
             | It shouldn't matter even if there truly is nowhere else for
             | you to go. There were zero forums in 1860 and yet we still
             | figured out how to call each other race traitors in
             | pamphlets. The first amendment right is to not have the
             | government police your speech (often nowadays extended
             | beyond speech to influence), it does not give you a
             | positive right to force anyone else to spread your speech.
             | 
             | More than that, forcing someone else to spread your speech
             | when they don't want to is a DIRECT violation of their
             | first amendment right, of the "government forcing you to
             | say things" kind.
             | 
             | You cannot enforce "free speech as an ideal" without
             | violating someone else's free speech.
        
           | ryandrake wrote:
           | These companies want their cake and they want to eat it, too.
           | They want to be treated as a publisher when it comes to
           | making editorial decisions, but they also want to be treated
           | as a dumb pipe when it comes to liability for the content
           | that they publish. They're Schrodinger's Forums: Both
           | publishers and non-publishers, depending on which one helps
           | them.
        
           | icandoit wrote:
           | The point of 230 was to eliminate the "publisher"
           | distinction. Otherwise, you can't eliminate spam without
           | accepting liability for whatever some psycho comments.
           | 
           | If anyone wants to delete spam or kick people off their own
           | servers, why should they be denied?
           | 
           | https://www.eff.org/deeplinks/2020/12/publisher-or-
           | platform-...
        
           | nobody9999 wrote:
           | >If they want editorial discretion, then they are
           | publishers...no section 230. If they want to be common
           | carriers then they should not be making editorial decisions.
           | 
           | If any post in this discussion called for it, this one
           | does[0][1].
           | 
           | [0] https://news.ycombinator.com/item?id=25697823 (9 January,
           | 2021 -- 439 points, 304 comments)
           | 
           | [1] https://www.techdirt.com/2020/06/23/hello-youve-been-
           | referre...
        
           | MisterBastahrd wrote:
           | What, particularly, was egregious about suspending the
           | account of someone who violated their TOS a hundred times
           | over?
        
       | notaustinpowers wrote:
       | My concern, ultimately, is what happens when these laws are
       | introduced and a Republican Congress continues to pursue a
       | repeal of Section 230.
       | 
       | So social media platforms are no longer able to censor any
       | content posted to their sites (outside of specific cases), but
       | are also liable for what users post to their platform?
       | 
       | If I was a conspiracy theorist I'd say these actions are
       | specifically to make social media itself inhospitable in a legal
       | sense. What company wants to take the risk of not being able to
       | moderate user-generated content but are also legally responsible
       | for that same content?
        
         | cuckatoo wrote:
         | Do we need to protect the social media platforms? Do they even
         | add value to society?
        
           | dingnuts wrote:
           | I'm sure the spooks agree that information was a lot easier
           | to control when it was just ABC, NBC, and CBS
        
           | notaustinpowers wrote:
           | Do you think this would stop at the big boys like Facebook
           | and Twitter?
           | 
           | What about Reddit? If I post a story in r/AITAH that the
           | other person discovers, this opens Reddit up to a possible
           | libel lawsuit if the other person thinks I'm not being
           | truthful. Do you think that's a risk that Reddit's legal
           | department will let them take? I highly doubt it. Do you
           | think unpaid volunteer mods want to be legally liable if they
           | remove a post for violating guidelines?
           | 
           | Let's go another level deeper, your neighborhood's Nextdoor
           | forum. If Susan from down the road starts saying I slept with
           | her husband to bad mouth me on the app, I could not just sue
           | Susan for libel, but I can also include Nextdoor in the
           | lawsuit. Do you think Nextdoor's legal team will allow that?
           | Again, nope. But Nextdoor wouldn't be able to remove Susan's
           | post because it would be "silencing her freedom of speech".
           | So instead, Nextdoor takes the logical legal decision to
           | completely remove any user-generated posts, effectively
           | killing their product.
           | 
           | And another level deeper, some random classic car forum with
           | maybe 100 users per month. What do they do? I think you get
           | the hint.
           | 
           | This isn't just going to affect social media platforms. This
           | will affect all platforms that allow users to post in any
           | capacity, text, photos, videos, links, etc.
        
           | simpletone wrote:
           | > Do they even add value to society?
           | 
           | Your presence here is your answer. Why are you here if social
           | media doesn't provide any value?
           | 
           | You'd have to be braindead to not understand the immense
           | value social media has added to society. But like all things,
           | social media has its negative aspects. But just because
           | social media isn't perfect, doesn't mean it has no value. As
           | I said, your presence here is proof of that.
        
             | tarxvf wrote:
             | I guess I'm braindead.
             | 
             | Their presence here does not indicate they value their time
             | here in a net-positive way. It offers no proof whatsoever.
             | 
             | They could be addicted. FOMO could drive them to
             | compulsively check the site. It might be their only social
             | interaction at all during the workday. This might be the
             | only group of people on the internet with similar
             | interests.
             | 
             | None of those things would necessarily make it a net-good
             | thing, if it also has negative repercussions that outweigh
             | the benefits. Many drugs, legal and illegal, are fairly
             | harmful. Sometimes the benefits are worth the negatives.
             | Sometimes they are very firmly not. Much of the anti-
             | social-media position is about social media being addicting
             | and net-harmful.
             | 
             | I personally don't see the immense value that you see. I've
             | seen some value for some specific sites for some short,
             | specific times. My grandma could interact with some of her
             | grandkids for a short while on Facebook, for instance. Of
             | course, she (or we) used to just pick up the phone, which
             | is what happens now too. I've seen some cool projects on HN
             | I'd otherwise likely never have seen. Otherwise I'm drawing
             | a blank.
        
               | simpletone wrote:
               | > I guess I'm braindead.
               | 
               | Let's see, shall we?
               | 
               | > None of those things would necessarily make it a net-
               | good thing
               | 
               | Who is talking 'net' here? The commenter simply
               | questioned whether social media added any value to
               | society. To deny social media has provided any value to
               | society is as braindead as denying that fossil fuels
               | added value to society. Now whether the negatives
               | outweigh the positives (aka net value) is an entirely
               | different question.
               | 
               | > My grandma could interact with some of her grandkids
               | for a short while on Facebook, for instance. Of course,
               | she (or we) used to just pick up the phone, which is what
               | happens now too. I've seen some cool projects on HN I'd
               | otherwise likely never have seen.
               | 
               | Oh so it does provide value. So you are agreeing with me
               | then?
               | 
               | > Otherwise I'm drawing a blank.
               | 
               | You aren't braindead. You are disingenuous. So you never
               | asked for or searched for information on reddit, hn,
               | stackoverflow, etc. You never found solutions to problems
               | on tiktok, youtube, etc? You don't know anyone who found
               | a job via hn, linkedin, etc? Bought or sold stuff on
               | facebook, etc?
               | 
               | If you believe that the negatives of social media
               | outweigh the positives of social media, then fine. That's
               | your opinion. But to cavalierly dismiss or deny that
               | social media provides value to society is being
               | disingenuous at best or braindead at worst. Or more
               | likely agenda driven nonsense.
               | 
               | Your comment reminds me of this excellent Monty Python
               | clip: "What have the Romans ever done for us?"
               | 
               | https://www.youtube.com/watch?v=Qc7HmhrgTuQ
        
         | lgleason wrote:
         | This is why we can't have nice things. The social media
         | platforms took things too far. Most republicans would be in
         | favor of the platforms not being liable for content if they
         | don't censor. Both political sides would probably also
         | agree that they should be taking steps to protect minors from
         | child predators etc., and that was the intent behind 230
         | protections. The social media platforms took 230 beyond what it
         | was initially intended for as a premise to weaponize it against
         | people whose politics they don't agree with. Obviously, with
         | that kind of loophole, 230 was not well thought out in that
         | regard and would need a re-write.
        
           | nobody9999 wrote:
           | >The social media platforms took 230 beyond what it was
           | initially intended for as a premise to weaponize it against
           | people whose politics they don't agree with.
           | 
           | So what? And I mean that quite seriously.
           | 
           | To clarify, I despise the big social media platforms and
           | refuse to use them even if that inconveniences me (and
           | occasionally it does). At the same time, those platforms have
           | the same free speech rights that I do, and curtailment of
           | _their_ free speech (and property) rights (in this case, the
           | right _not_ to publish /amplify the speech of others) also
           | curtails _my_ free speech (and property) rights.
           | 
           | And since I want to protect my own rights, I support the
           | _right_ of those rapacious scumbags to moderate /censor on
           | their own _private property_.
        
             | ThrowawayTestr wrote:
             | If platforms are going to be editors then they don't need
             | section 230 protection.
        
             | someuser2345 wrote:
             | You're right, these companies do have the right to censor
             | speech that they disagree with. However, if they do that,
             | then they are responsible for any speech that they do not
             | censor. So for instance, if someone posts a libelous
             | statement on Reddit, and Reddit doesn't remove that
             | statement, then the victim of libel should be allowed to sue
             | Reddit directly, instead of just suing whoever made that
             | post.
        
               | pbhjpbhj wrote:
               | I think I'd agree if the circumstances are the same, and
               | on request a libel was not taken down for you when it was
               | for other people. The same circumstances including being
               | on the same subreddit, being caught by the same auto-
               | filter, being notified in the same way, at least.
               | 
               | So, for example if libel against you wasn't caught by a
               | particular filter then that's not bias against you, and
               | so they are not party to the libel (unless you can show
               | the filter was specifically designed to fail to catch
               | libels against you).
        
           | Analemma_ wrote:
           | > The social media platforms took 230 beyond what it was
           | initially intended for as a premise to weaponize it against
           | people whose politics they don't agree with.
           | 
           | So what? You're taking it as a given that "if a website
           | moderates content in a way I don't like, the government must
           | step in and force them to do moderation in a different way I
           | do like"; I don't accept that premise at all, it's quite
           | totalitarian in both concept and real-world execution.
        
           | shadowgovt wrote:
           | So it turns out, inconveniently, "Children need protection
           | from predators" and "People need protection from bigots" are
           | two sides of the same coin.
           | 
           | It's all politics and it always was. Claiming one is "common
           | sense" and one is "political" misses what "political" means.
           | 
           | I think S230 set a pretty good tone in biasing in favor of
           | the right of the service owner to set the tone within the
           | constraints of the law (for a simple practical
           | reason: if you don't give them that right, they'll just stop
           | providing the service). But that does mean that when your
           | provider decides that, say, debating the humanity of trans
           | folks is no longer acceptable, we toe the line there or we
           | start our own service.
        
             | notaustinpowers wrote:
             | Thanks for your input on this, especially for S230. I never
             | understood the concept that a private business has to
             | entertain or host content that they do not want on their
             | site.
             | 
             | If a friend invited me to their house, I could start
             | calling their wife a fat cow and telling them how ugly I
             | think their children are. That doesn't mean they have to
             | put up with it or accept that sort of speech on their
             | private property, and it doesn't mean they censored me or
             | inhibited my freedom of speech.
        
               | tiltowait wrote:
               | Well, did your friend invite you there saying his house
               | was a place for you to share your mind?
               | 
               | For me, it's an issue of scale. Your friend is a single
               | person (or family). Twitter is a gigantic, faceless
               | corporation (okay, Elon Musk makes it less faceless, but
               | you know what I mean) that tries to cater to everyone--in
               | effect, it tries to be a commons. Can we consider it one?
               | Should we? Should we make a law that says once a social
               | media company gets to a certain size, it can't censor
               | anything anymore if that content is legal?
               | 
               | I'm generally against additional regulation. I don't
               | think, for instance, a small pro-life forum should be
               | forced to allow pro-choice people to spew vitriol, nor
               | vice versa. I'm hesitant to say the same for a giant
               | company like Twitter or Facebook. There, I think it might
               | be more appropriate to have comprehensive filtering and
               | self-moderation tools vs. shutting people out completely
               | (assuming their behavior is legal).
        
               | notaustinpowers wrote:
               | A lot of these issues stem from the fact that, until
               | recently, society has never had to grasp these issues.
               | Our laws are not equipped to handle these sorts of
               | questions. Especially when that question ultimately is
               | "When does a private company become a public service and,
               | therefore, must abide by the laws and regulations
               | applicable to all public services?"
               | 
               | My statements, while I myself do not necessarily agree
               | with them, are what I view as possible when operating
               | within the current legal framework our government has
               | built for these private companies.
               | 
               | To me, if the government wants to hold a private company
               | to the standard of a public service, then that private
               | company must fully, legally, and entirely become
               | (somewhat) a public service. I view that as becoming a
               | service similar to the United States Post Office.
               | 
               | It's allowed to continue to operate as a company but has
               | to comply with government regulations (whatever those may
               | be). That also means that its goal is not profit
               | generation. It can still charge users for certain
               | services if it wishes but is no longer able to sell user
               | data, and it must remain revenue-neutral.
        
               | dpkirchner wrote:
               | This is a fair analogy but I think it could go further:
               | your friend shouldn't be required to take what you say,
               | print it out, and hand copies to everyone else that comes
               | in to their house. They can say no, you can't do this
               | here and I won't repeat it.
        
           | downWidOutaFite wrote:
           | Most Republicans want to censor just different things, just
           | look at all the book bannings across the country. There are
           | tons of things that lefty people complain about being
           | censored on social media but we don't hear about it because
           | left-wing media is nowhere near as powerful as right-wing
           | media. (The main lie in a lot of these discussions is that
           | corporate media is lefty.)
        
           | kmeisthax wrote:
           | CDA 230 exists because the Wolf of Wall Street was trying to
           | censor evidence of his crimes. He got one ruling against
           | Prodigy saying moderation makes you liable for defamation,
           | and another from the CompuServe case saying no moderation
           | means no liability.
           | 
           | Let me be very clear about how the law would work sans CDA
           | 230: any time someone does not like what you are saying, they
           | can sue the platform you host it on to get you censored. The
           | only platforms resistant to this would be ones full to
           | bursting with spam. This is already really bad. If you want a
           | partial repeal, i.e. one where platforms are still allowed to
           | "protect minors from child predators", I'm not sure that'll
           | pass muster at SCOTUS. Selectively removing speech
           | protections based on content is a no-go.
           | 
           | Furthermore, platforms being able to take down political
           | speech they disagree with is not a "loophole". That's just
           | what moderation _is_. The whole point of a moderator is to
           | silence the loudest voices, so that others may speak.
        
           | dragonwriter wrote:
           | Section 230 was written without limitation on the kind of
           | censorship allowed because it was an end-run around
           | Constitutional prohibition of government content favoritism
           | that had destroyed previous censorship laws and was
           | (correctly) suspected would endanger much of the rest of the
           | law it was incorporated into.
           | 
           | If you want to allow the kind of private censorship S230 was
           | intended to protect, and stay within the First Amendment,
           | S230 is what you get.
        
           | mrguyorama wrote:
           | 230 isn't a loophole, it's a requirement for any business on
           | the web. You cannot even have product reviews without
           | something like 230.
        
         | vik0 wrote:
         | >to make social media itself inhospitable
         | 
         | I see nothing wrong with that. Social media has done nothing
         | but harm its users, and people that know its users (or:
         | everybody)
        
           | halfmatthalfcat wrote:
           | > Social media has done nothing but harm its users
           | 
           |  _Nothing_? I don 't think it's that cut and dry. While there
           | has been harm, no doubt, there has been a lot of positives
           | too (maintaining connections, thoughtful conversations, etc).
        
           | NavinF wrote:
           | Then why did you write this comment on HN (an example of
           | social media)?
        
           | carlosjobim wrote:
           | Then why are you writing here on this social media and
           | harming people?
           | 
           | Social media is such a big part of modern communications,
           | that arguing against it is like arguing against telephone
           | lines or the printing press. There's a lot to criticise the
           | owner companies for, including how social media can harm
           | people and infamously promote and accelerate genocide, as
           | Facebook is accused of having done in Myanmar. But the
           | same accusations can be levied against any means of
           | communication and publishing technology.
        
           | notaustinpowers wrote:
           | Thanks for your comment here on the social media site, Hacker
           | News!
           | 
           | I think we have to reorient ourselves on what counts as
           | social media. It's not just Facebook, Twitter, and Instagram.
           | It's forums like Hacker News, it's comment sections on your
           | favorite blogs you frequent. It's the products available on
           | Etsy or a small creator's personal Shopify store.
           | 
           | Free speech isn't just about photos and text, it covers all
           | forms of expression.
        
             | kmeisthax wrote:
             | We used to just call these forums. Social media is a subset
             | of forum where the posts are presented in a personally
             | curated timeline of some kind.
        
               | Goronmon wrote:
               | I would argue forums are just a subset of social media.
               | "Social media" being the high level umbrella for
               | platforms where users interact. Differentiated from
               | platforms which are "read-only".
        
           | vik0 wrote:
           | I knew I would get the replies that I got. It reminds me of
           | this meme: https://i.kym-
           | cdn.com/entries/icons/original/000/036/647/Scr...
        
         | tekla wrote:
         | Why single out Republicans?
         | 
         | The Democrats hate Section 230 as much as the republicans.
        
           | michaelt wrote:
           | The Florida and Texas laws were both passed by Republican
           | legislatures. And Biden and Obama have not yet been banned by
           | Twitter.
        
           | DinoDad13 wrote:
           | what?
        
         | DoodahMan wrote:
         | i share your tinfoilery: the goal is to erode public discourse
         | on the internet, censorship (self or otherwise), and so on.
        
         | hellojesus wrote:
         | > My concern, ultimately, is what's going to happen when these
         | laws are introduced and a republican congress continues to
         | pursue a rejection of Section 230?
         | 
         | It's not a mystery. Look what happened when sesta/fosta became
         | law. Craigslist had to dump their entire personals section
         | for fear they'd miss a single ill-intentioned post.
         | 
         | The consequence of 230 removal is only to destroy the ability
         | for people to interact with one another publicly online unless
         | one of them (or the platform) is willing to take on liability
         | for the interaction.
        
       | troyvit wrote:
       | > As distasteful as this content may be, it is protected by the
       | First Amendment. But that protection only extends to government
       | actors. Amici are private actors, and the forums they control are
       | private forums. Those who are censored are free to make their own
       | websites to host their speech. They are not free to hijack
       | amici's websites.
       | 
       | This is the crux of so much happening on the internet right now.
       | Users treat our largest providers, like Google, as public
       | resources similar to roads, and governments want to treat our
       | largest forums, like Twitter and Facebook, as government
       | entities beholden to the same rules they are.
       | 
       | Neither is true, and to me it points to the massive impact our
       | largest companies have been able to achieve. It seems larger than
       | what the law has words for.
       | 
       | I don't know what the answer is.
        
         | dingnuts wrote:
         | the answer is probably standards and thoughtful regulation to
         | enforce them, but the chances that today's Congress can produce
         | those things are slim to none imho
        
           | vik0 wrote:
           | What is thoughtful to you is likely not to be thoughtful to
           | somebody else
           | 
           | What is generally accepted as thoughtful in one world
           | region will likely not be considered generally thoughtful
           | in another world region
           | 
           | Furthermore, the concept of "thoughtfulness" may not exist in
           | some world regions - in fact, it may be a concept in a
           | minority of world regions
           | 
           | Should these thoughtful regulations (whatever they may be)
           | only apply to denizens of a certain region or regions, or
           | everybody in the world?
        
             | mrguyorama wrote:
             | Uh, they should apply to the 350 million Americans that
             | this government represents.
             | 
             | What is your point? Iran is free to tell Google to take a
             | hike if Google censors their calls to violence. Germany is
             | free to tell id software to remove the swastikas or hit the
             | road.
             | 
             | This isn't hard. Don't pretend it is.
        
             | numpad0 wrote:
             | Just build a Gestapo this time with proper controls and
             | oversight. That's weird, but if there are multiple mob
             | groups doing the same, some even foreign, that's a threat
             | to any free nation and their power shall be transferred to
             | rightful governments.
             | 
             | dc: not a US person
        
               | paulddraper wrote:
               | > Just build a Gestapo this time with proper controls and
               | oversights
               | 
               | Where Poe's and Godwin's laws intersect.
        
         | sanity wrote:
         | > Amici are private actors, and the forums they control are
         | private forums
         | 
         | Except we now know through the twitter files and other
         | disclosures that government agencies were intimately involved
         | if not the driving force behind much of the censorship,
         | although this isn't relevant to the SCOTUS case.
        
         | MisterBastahrd wrote:
         | The answer is simple: if the USSC would be against
         | nationalizing these services, then they should also be against
         | attempting to restrain their ability to moderate their own
         | content. Nobody with even a pair of brain cells can logically
         | conclude that actors like Google and Facebook have ever been
         | either benign or neutral, nor should courts pretend that an
         | inability to understand those facts should be burdens for said
         | networks.
         | 
         | Just because, as Chaya Raichik complained the other day, she
         | should be able to say what she wants because "there's no law
         | against lying," doesn't mean that platforms necessarily have
         | to allow harmful bullshit either.
        
           | troyvit wrote:
           | I think I'm down for this in the abstract, but that's because
           | I'm not a powerful voice that has been censored by one of
           | these big players, and I also am glad for most of the
           | censorship I've seen them do.
           | 
           | But if I step out of my shoes and look at Trump for instance,
           | does he have a legitimate grievance for being kicked off of
           | Twitter if he actually _did_ think that he fairly won the
           | election and was deplatformed because he tried to speak about
           | it?
           | 
           | If Twitter at the time was as powerful as a government in
           | regulating speech ... should it have to follow the same
           | rules? If the playing field for social media was more level,
           | it wouldn't be an issue. He can just go to another provider
           | and have his speech.
           | 
           | Still, re-reading what you said I have to agree that the end
           | goal should be people understanding that Google and Facebook
           | (and Twitter) are not benign or neutral and never will be.
        
         | TulliusCicero wrote:
         | > I don't know what the answer is.
         | 
         | Well, one obvious option would be the government making its own
         | competitors to Twitter/Youtube/et al.
         | 
         | Yes yes, I realize there's a bunch of issues there, like how
         | the government would REALLY have to permit virtually any kind
         | of speech, or general technical incompetence from government
         | agencies in running such a site. But it _could_ be done, there
         | 's nothing actually stopping it.
         | 
         | I actually think a government-run Twitter could work okay and
         | be accepted by people IF they didn't do any kind of algorithmic
         | recommendations, sorting, or even have a search function at
         | all. You could see tweets/content that you got to via external
         | link, you could go onto that person's page to see a
         | chronological list of things they've said, they could have a
         | manually created profile page that links to others, but no
         | curation of any kind by the platform itself. The home page
         | would be mostly blank, or maybe only have a list of official
         | government accounts or something.
        
         | ploxiln wrote:
         | > Those who are censored are free to make their own websites to
         | host their speech.
         | 
         | I'm very sympathetic to this argument. Seems fair to me.
         | 
         | But then Cloudflare, and any host big enough to withstand DDoS
         | attacks, is strongly pressured by seemingly most people on the
         | Web and in the US, to kick "bad websites" by "bad people" off
         | their platform. So we can't just let bad people have their own
         | website which we don't visit? I kinda wish we could. If most
         | people hate that bad people can have public websites, just say
         | that this is the best we can do; at least they're not on your
         | parents' facebook/twitter.
        
           | paxys wrote:
           | There is a very valid argument in favor of treating ISPs
           | and hosting providers the same as your electricity and
           | water company, and preventing them from banning users on a
           | whim. However, the companies currently being targeted
           | are still a few levels removed from that. In fact the party
           | advocating for this is also the one opposed to net
           | neutrality.
        
           | kevingadd wrote:
           | Cloudflare is a weird example to pick here, since kicking
           | e.g. stormfront or kiwifarms off CF doesn't deprive them of
           | the ability to host a website, it just deprives them of
           | obfuscation and a free CDN - effectively, the service CF is
           | offering to them is not hosting but cost reduction and
           | insulation from the consequences of their speech.
           | 
           | It's reasonable in an abstract sense to think that it's a
           | good thing if CF offers those Bad People services like that
           | based on philosophical goals or political alignment, but it's
           | very distinct from 'having a public website'.
           | 
           | AFAIK it's still possible for anybody to put up a linux box
           | with nginx on it and put any content they want there, other
           | than the fact that a lot of consumer ISPs don't allow you to
           | run servers anymore. But that's a different problem and
           | cloudflare can't fix it.
        
             | spondylosaurus wrote:
             | > effectively, the service CF is offering to them is not
             | hosting but cost reduction and insulation from the
             | consequences of their speech.
             | 
             | This is somewhat tongue-in-cheek on my part, to be clear,
             | but this raises an interesting point about whether
             | orchestrating a DDoS attack is a form of free speech. I'm
             | inclined to say "yes" more than "no."
             | 
             | (You could draw a parallel, for example, to counter-
             | protestors who try to drown out Westboro Baptist Church
             | picketers by holding up their own signs....)
        
             | numpad0 wrote:
             | It's not weird, it's just an example of how modern
             | society treats anything with sufficient scale as a public
             | resource, just by scale and reach, and how little it
             | cares about private corporate rights.
             | 
             | You run the water to a town and the town now owns it. If it
             | didn't, the town regulates it to the point you're
             | essentially owned by the town, and the result is the same.
        
           | troyvit wrote:
           | Good example with Cloudflare, and they offer an interesting
           | flip side with Project Galileo, where they offer their
           | Business tier product for free to groups they deem
           | vulnerable:
           | 
           | > Any qualified vulnerable public interest site can seek
           | participation in Project Galileo. Examples of participants
           | include, but are not limited to, minority rights
           | organizations, human rights organizations, independent media
           | outlets, arts groups, and democracy protection programs.
           | 
           | The place I work for qualifies, and it has kept us afloat.
           | It's another example where they change the landscape for a
           | group they select (I'm just super glad they did).
           | 
           | [1] https://www.cloudflare.com/galileo/
        
       | nradov wrote:
       | Supreme Court Justice Clarence Thomas has suggested that Congress
       | consider extending Common Carrier legislation to cover online
       | services. Essentially this would force them to allow all legal
       | content, sort of like a telephone company.
       | 
       | https://www.npr.org/2021/04/05/984440891/justice-clarence-th...
        
         | 2OEH8eoCRo0 wrote:
         | If that happened what would the repercussions be?
         | 
         | I think they'd come together and create a certifying authority
         | for authentic users with traceability. More compliance. Better
         | filtering/admin tools for users.
        
           | nradov wrote:
           | I doubt it. That wouldn't provide any legal protection or
           | cover to social media companies. Their main concern is with
           | maintaining a positive brand image which keeps regular users
           | and advertisers onboard. So, they don't want to host legal
           | but offensive content that would decrease user engagement or
           | drive advertisers away.
           | 
           | Forcing users to certify and authenticate themselves would
           | drive a lot of users away and thus devastate advertising
           | revenue. And many users will happily post offensive content
           | using their real identities.
        
           | bilbo0s wrote:
           | No repercussions, because being a common carrier is
           | voluntary.
           | 
           | What I mean is, you can have a members only club that is not
           | a common carrier. For instance a member of the hypothetical
           | video snippet network SnipWit can communicate only to other
           | SnipWits. This means that SnipWit is not a common carrier.
           | They are obviously private, as you can only communicate to
           | other members. Worse yet, there is obviously consideration
           | required prior to use, which means they would be able to add
           | even more draconian terms to their membership requirements.
           | 
           | I'll give a hypothetical. CostCo is members only. You're not
           | a member, you can't get groceries there. Full stop. It
           | doesn't matter if it's the only grocer in your area, you'll
           | have to drive to find a Kroger. And to illustrate what I mean
           | by the ability to take things further, CostCo could add
           | racial exclusivity requirements to their membership clauses.
           | Blacks would then be barred from shopping there like they are
           | barred from certain country clubs. And it would be totally
           | legal and well within the rights of CostCo, those country
           | clubs, or the hypothetical SnipWit. Why? Because these are
           | all private organizations and members only.
           | 
           | I guess what I mean is, a lot of people talk about Common
           | Carrier being the solution while forgetting about the
           | private/public aspect and distinction at play in those
           | regulations. We normally entice organizations to become
           | common carriers by offering them goodies on the other side of
           | that. Like indemnifications for instance. But what happens
           | when you have organizations that are already fat, happy and
           | growing like weeds under their "private with membership"
           | umbrella? What do you offer them? It's a tough problem.
           | There's a lot of people and shareholders making a lot of
           | money in the current model. Those people will almost
           | certainly vote their shares _against_ becoming a common
           | carrier unless there is more upside in it for them somehow.
        
             | nradov wrote:
             | You have misinterpreted the case law. Although Costco
             | requires a membership to shop there, courts have generally
             | found such places to be public accommodations rather than
             | true private clubs. Thus, they would be legally barred from
             | instituting a racial exclusivity requirement for
             | membership. (I am only commenting on the strict legal issue
             | here; obviously racial discrimination is morally wrong.)
             | 
             | https://www.cbsnews.com/minnesota/news/good-question-why-
             | can...
        
           | icandoit wrote:
           | Would that mean that my political screeds (or spam) can't
           | be deleted from your social media website for dogs?
           | 
           | That you would have to continue to host and serve whatever
           | content I publish? Even if your userbase only interacts
           | with my content to hide it?
        
         | shadowgovt wrote:
         | Anyone who has been dealing with phone scams should have a
         | healthy sense of dread on the idea that internet services
         | should achieve parity of quality with phone service.
        
       | AnarchismIsCool wrote:
       | Something to keep in mind: the way we view a lot of these
       | cases is fundamentally different than the way platforms look
       | at them.
       | We care about a free-speech vs moderation debate whereas the
       | platforms care about "can we make more money from our advertisers
       | this quarter?". These are fundamentally misaligned and are the
       | source of a lot of the friction between "publishers"/"carriers"
       | and breathing humans.
       | 
       | The set of things advertisers will accept is wildly different
       | than the set of things we accept because they believe that their
       | brand is being associated with whatever content is on the
       | platform.
        
       | ysofunny wrote:
       | the free speech rights of giant companies VERSUS the free speech
       | rights of the users of the products of them giant corporations
       | 
       | the plot twist is how the figurative judge presiding over this
       | contest is the literal will to the power of dictating how people
       | ought to think
        
         | shadowgovt wrote:
         | Freedom of speech and freedom of the press are both protected
         | by the First Amendment but are separable rights.
         | 
         | It doesn't infringe one's free speech rights if a paper (or
         | media service) refrains from transiting one's opinion to their
         | readers. Start one's own press.
        
           | ysofunny wrote:
           | i'm pointing at how this is an issue between corporations
           | exerting their rights and individual users exerting theirs
           | 
           | but you divert the focus towards the technicalities of
           | specific rights
           | 
           | further, this is not about "press", social media is different
           | enough from "the press" that it should be treated accordingly
        
             | shadowgovt wrote:
             | It's not "technicalities;" these rights only exist in a
             | legal sense _because_ of the Constitutional protections.
             | 
             | Remove those protections and the rule doesn't become
             | "Everyone gets to say whatever on Twitter;" they become
             | "Twitter can tell everyone to pound sand, can lie, can
             | commit fraud, can pretend to be other people and edit your
             | messages in transit," etc. Without the law, it's might-
             | makes-right and the corporations _definitely_ control the
             | wires and the databases.
             | 
             | I argue that forcing the corporations to transit bits of
             | various users infringes upon their rights more than the law
             | traditionally demands (and morality requires, since the
             | existence of a corporation doesn't immediately infringe
             | anyone else's right to start their own website).
             | 
             | > social media is different enough from "the press" that it
             | should be treated accordingly
             | 
             | How so?
        
               | ysofunny wrote:
               | > How so?
               | 
               | network effects
        
       | devaiops9001 wrote:
       | Mike Benz spilling the beans here tells you everything you
       | need to know about censorship and who has their hand up whose
       | ass, causing the pandemic of censorship to happen.
       | 
       | https://rumble.com/v4dtxtu-everything-you-need-to-know-about...
        
         | shadowgovt wrote:
         | https://www.youtube.com/watch?v=CRYSKaS-XtQ
        
         | kevingadd wrote:
         | Hard for me to take this seriously when it opens with Tucker
         | Carlson talking to me after he just finished doing a
         | promotional campaign for Vladimir Putin, and your language in
         | this comment + lack of details isn't helping. What do I need to
         | know about censorship that he's going to tell me? What is this
         | "pandemic of censorship"? Who is Mike Benz?
         | 
         | Wish it was text so I could scan through it to see if it has
         | any merit.
        
       | 2OEH8eoCRo0 wrote:
       | It's such a tricky situation and there are so many questions (and
       | opinions)!
       | 
       | What was the intent of Section 230?
       | 
       | Part of the text reads:                   (3) to encourage the
       | development of technologies which maximize user control over what
       | information is received by individuals, families, and schools who
       | use the Internet and other interactive computer services;
       | 
       | Is this happening?
        
       | hellojesus wrote:
       | > The court cited several cases in support of this position,
       | including Pruneyard Shopping Center v. Robins. In that case, the
       | Supreme Court held that California could permissibly require a
       | shopping mall to allow individuals to distribute pamphlets inside
       | the premises--reasoning that a business establishment that holds
       | itself open to the public has no "expressive" interest in who is
       | allowed to speak in the establishment.
       | 
       | I don't understand how this can be true. If I have a bunch of
       | hooligans handing out pamphlets to everyone that enters my
       | private shopping mall such that the content on the pamphlets is
       | actively deterring shoppers from conducting business in my
       | privately owned establishment, I have a very compelling interest
       | in the "expressive" content of the pamphlet.
       | 
       | Aside, this is akin to forcing my business to allow people to
       | sling racial insults at shoppers with no recourse like kicking
       | them out.
        
         | joshuamorton wrote:
         | You're reading a lot into the case that wasn't true. In
         | Pruneyard, the pamphleteers weren't being disruptive.
        
           | pbhjpbhj wrote:
           | Pamphleteers who didn't hand out pamphlets? Did they only
           | hand pamphlets to people who went over to them and asked
           | for one? Otherwise they would be disruptive.
        
           | hellojesus wrote:
           | I'm generalizing the result of the case. Which is applicable
           | so long as the current lawsuit references it as supporting
           | evidence.
        
       | justinzollars wrote:
       | I like David Sacks's perspective on Section 230. Yes, there is
       | a huge bias in Silicon Valley against conservatives. But if we
       | get rid of 230 it will get even worse. The moderators are all
       | liberal. Personally I think the bias is generational, and will
       | work itself out with time.
        
         | DinoDad13 wrote:
         | Science is biased against conservatives. Literature is biased
         | against conservatives. Logic is biased against conservatives.
        
         | kevingadd wrote:
         | This bias is overstated, if it's even true at all. There are
         | plenty of conservatives in tech with lots of money and power,
         | many of them run hosting companies or CDNs. Trump is easily one
         | of the most reviled figures out there right now and he's had no
         | problem operating an entire social media service of his own,
         | for example. And conservative politicians do just fine - big
         | tech CEOs and founders show up to meetings with them and make
         | donations all the time.
        
       | kmeisthax wrote:
       | Conservatives are finding out the hard way what liberals were
       | complaining about a decade ago with Net Neutrality and Comcast's
       | argument that charging Netflix up the ass was their free speech
       | expression.
       | 
       | That being said, these bills are very, very bad ideas. Let's be
       | clear: no broadcasting medium can work without some mechanism to
       | censor spammers. And this role has to be specialized (rhymes with
       | 'centralized') because nobody wants to spend most of their day
       | online just manually selecting spam to be blocked. If these laws
       | are upheld I can see companies moving to block Texans and
       | Floridians to protect the rest of the country from their
       | legally mandated political spam.
       | 
       | Let's also appreciate that a good chunk of the speech
       | conservatives want to 'protect' is political speech explicitly
       | calling for things prohibited by the 1st Amendment. Shit like
       | banning an entire religion. I personally don't think that should
       | be allowed, though I doubt this particular court's 6-3
       | conservative majority would go along with censoring the censors.
        
       | paxys wrote:
       | Funny to see people here cheering on these laws (because
       | something something _free speech_ ) while simultaneously enjoying
       | one of the most productive yet also heavily moderated social
       | media sites on the web (HN). Things are already bad enough in the
       | country with evangelicals dictating what books we can read and
       | how we are allowed to have sex. I don't want them controlling the
       | internet as well.
        
         | kevingadd wrote:
         | HN is quite lightly moderated compared to most forums I can
         | think of. I see lots of boundary-pushing comments survive
         | without getting downvoted. The team running HN work hard,
         | obviously, and there's some smart tech managing things like the
         | front page, but people are allowed to speak pretty freely on
         | here as long as they adhere to the rules. Punishing rule
         | violations is a little different from moderating speech too,
         | I'd argue.
        
           | ceejayoz wrote:
           | > Punishing rule violations is a little different from
           | moderating speech too, I'd argue.
           | 
           | That seems... hard to argue. The rules are often _about
           | speech_. HN, for example, has rules about being kind,
           | avoiding flamebait, not sneering, avoiding ideological
           | battle, accusations of astroturfing; the list of
           | _restrictions on speech_ is quite extensive.
           | https://news.ycombinator.com/newsguidelines.html
        
         | paulddraper wrote:
         | > Things are already bad enough in the country with
         | evangelicals dictating what books we can read and how we are
         | allowed to have sex.
         | 
         | You can't read child porn, or have sex with minors.
         | 
         | If anyone is dictating something else to you, I'm unaware
         | of it.
        
       | kstrauser wrote:
       | A reminder that "social media" is not just a bunch of gigacorps.
       | I have a small Mastodon server that I host as a hobby. I've
       | collected $0.00 in gross revenue from it; not net, but gross. I
       | read the Florida law as best I could and didn't find any carve
       | outs for small, non-profit, personally run social media.
       | 
       | Well, nuts to that. I can and will censor whatever I and my users
       | decide we don't want to see. The people we censor are free to
       | download and install their own copies and make their own
       | moderation decisions based on their own community norms.
       | 
       | It's ridiculous that we're being held to the same standard as
       | Facebook and X. And if that means these laws shouldn't apply
       | to Facebook and X either, then so be it. I'm not willing to
       | give up my own First Amendment rights to punish someone else.
        
       ___________________________________________________________________
       (page generated 2024-02-27 23:01 UTC)