[HN Gopher] Two Supreme Court cases that could break the internet
       ___________________________________________________________________
        
       Two Supreme Court cases that could break the internet
        
       Author : fortran77
       Score  : 62 points
       Date   : 2023-01-26 05:50 UTC (17 hours ago)
        
 (HTM) web link (www.newyorker.com)
 (TXT) w3m dump (www.newyorker.com)
        
       | LatteLazy wrote:
       | It continues to amaze me how many people, even here on HN, just
        | parrot (factually incorrect) talking points from various
       | groups. Others pick a single desired outcome (less moderation)
       | and refuse to address any of the resulting outcomes (Nazi and
       | Terrorist propaganda on all major sites for instance).
        
         | x86_64Ubuntu wrote:
          | They aren't refusing to address it so much as they want that
          | outcome. The Right in the US is tired of being hamstrung by
         | fact-checkers when spreading dubious claims, so the only
         | remaining avenue is to remove moderation altogether. If sites
         | die because they become a Nazi and spam cesspool, then so be
         | it.
        
         | untech wrote:
         | I think the terms "Nazi", "terrorist" and "propaganda" won't
         | help you, if you want to have a complex view of the world.
        
           | LatteLazy wrote:
           | I really just mean actual self-described neo-nazis...
        
             | untech wrote:
             | Well, for the audience of Russian _highly regulated_ TV,
             | news sites and social networks, the invasion of Ukraine was
             | necessary because the poor neighbour was suffering under a
             | regime of "actual self-described Neo-Nazis". Russian
             | Ministry of Defence even reports its daily _results_ in
             | terms of "Neo-Nazis eliminated". My parents are consumers
             | of Russian media, and it is impossible to discuss the war
             | with them, because they talk and _think_ with these
             | flexible terms like "Nazi", "terrorist" and "propaganda".
             | And they are not unintelligent people.
             | 
             | I wish citizens of the free world would be more careful
              | with their freedoms. Democracy and autocracy are a
              | spectrum, and the slide towards dictatorship, in my
              | experience, can be gradual and subtle.
        
               | LatteLazy wrote:
               | I am a little confused by your point. I don't disagree, I
               | just don't see it as dictatorial for people to be allowed
               | to decide what happens in their communities. In fact, I
                | think the only way any community can actually function
               | is to have the right to exclude bad content and bad
               | actors. That's true for a church or a book club or hacker
               | news or facebook or anyone else.
        
               | untech wrote:
               | Well, I agree that moderation is useful for communities.
               | I disagree that moderation standards should be defined by
               | a regulator. On reddit, different communities have
               | different moderation policies. Same for various chat
               | groups. And there are resources with no moderation, like
                | anonymous boards. This is fine. If you don't like the
                | moderation policy of your community, argue for a change
               | or create a fork. But don't involve the government in it.
        
               | LatteLazy wrote:
               | I don't think the regulator should define moderation
               | standards. I am not sure what gave you that idea?
               | 
                | That is not the current system OR the result of removing
                | s.230. Though removing s.230 would mean courts were much,
                | much more involved and moderation was much, much heavier
                | (or non-existent, a la 4chan etc.).
        
       | 1vuio0pswjnm7 wrote:
       | Two cases that could break Big Tech's hold on the internet.
       | 
       | These cases are not going to stop anyone from publishing their
        | own content via the internet, but they might affect Big Tech
       | intermediaries that publish only other people's content and
       | produce no content themselves.
       | 
       | These cases will not stop traditional media organisations that
       | employ journalists from fact-checking and publishing the content
       | they produce.
       | 
       | Even if these cases are decided favourably to the defendants,
       | "tech" companies will continue to publish low quality, low-brow,
       | incendiary, false, "clickbait" content. That is what gets
       | eyeballs. Section 230 is not incentivising them to moderate, as
       | it was intended. Advertising is incentivising them to allow
       | anything that garners more views and "engagement". Section 230 is
       | only providing them with immunity from all manner of litigation
       | around _any_ moderation. There is no incentive to fact-check
        | anything. Even "recommending" terrorist group videos is
        | apparently fair game.
       | 
        | If they have zero obligation to moderate and no incentive to
        | fact-check because they have immunity under 230, i.e., if
        | Google wins against Gonzalez, then they can grow their once
        | modest websites to gigantic, overpowering size, with a
        | disproportionate amount of influence on the open web.^1
       | 
        | Reframing the situation, the gigantic websites now call
        | themselves "platforms"; the sharecroppers and serfs are
        | "publishers" and "creators".
       | 
       | Grab a stake in the "Metaverse".
       | 
       | You cannot break something that is already broken.
       | 
       | 1. For Big Tech this web feudalism is the Holy Grail. Grow the
       | audience size and reap the advertising revenue. Obviously any
       | proposed change, an improved interpretation of Section 230, that
       | could threaten this status quo is unacceptable to them. But they
        | are not the majority of web users. Was the web created so that
        | a few websites could control the majority of traffic? In their
       | view, yes. That is the only way the web can operate. How
       | convenient.
        
         | 1vuio0pswjnm7 wrote:
         | It is relatively common to see HN thread discussions that
         | complain about some form of "censorship" by Big Tech, or, more
         | generally, about "free speech" issues (real or imagined) with
         | respect to Big Tech websites.
         | 
          | Perhaps it is worthwhile to ask ourselves what it is that
          | allows such "censorship" or threats to "free speech" to
          | happen. A few ideas:
         | 
          | 1. Big Tech's position as intermediaries, aka "middlemen". Web
          | users, and to some degree internet users (cf. "tech" workers),
          | are not expected to communicate with each other directly
          | without using Big Tech to do it. All web traffic must be
          | observable by Big Tech in ways that allow Big Tech to sell its
          | "online advertising services". End result: "Surveillance
          | capitalism".
         | 
         | 2. Section 230. Without this immunity from litigation, arguably
          | Big Tech could not "censor" successfully. If they could, then
          | why would they oppose any changes to Section 230
          | interpretation? Of course, neither "censorship" nor "free
         | speech" is really the correct word to describe the activity Big
         | Tech is engaging in that draws controversy. Big Tech promotes
         | certain content. It does not matter how they do it, whether
         | they measure "popularity" or whether they accept payment for
         | placement of ads that look like search results, or "targeted"
         | ads or whatever.
         | 
          | In Gonzalez, the facts are around "recommendations":
          | unsolicited promotion of certain content to certain users.
          | This is done not at the request of the user; it is not
          | optional. It's done for Google's benefit.
         | 
         | These companies that profit from surveillance and associated
         | advertising are not "neutral". They are not "dumb pipes". Given
         | that they are under no contractual obligations to any web user,
         | only a fool would claim they are "critical infrastructure".
         | They are surveillance, data collection and advertising
         | companies.
         | 
         | Does Big Tech need Section 230 protection in order to continue
          | to pull this off?
         | 
         | When Section 230 was enacted, there were no "tech" companies.
          | Online service providers such as CompuServe, Prodigy and, for
          | a while, AOL all charged fees. There were contracts between
          | these companies and their users. There was customer service.
          | Users were not ad targets, they were not "the product"; they
          | were customers. Users could sue these companies for not
          | performing the job that they were being paid to do. Then came
          | Section 230.
         | 
         | People always envisioned that the internet would be used for
          | commerce. However, today "tech" companies and, most importantly,
         | "Big Tech" try to make _all_ web activity commercial.
         | Everything anyone does using the internet is surveilled with an
         | eye toward its commercial significance, if any.
         | 
          | And that practice itself, the surveillance of _all_ internet
          | activity, even academic or recreational use, is considered a
          | "business model". Users pay nothing. Why would anyone pay to
          | be surveilled and served with targeted ads? Advertisers pay
          | for the surveillance.
         | 
         | Don't be evil.
         | 
         | Everyone anticipated selling widgets^FN1 over the internet. No
         | one envisioned "tech" companies. Websites with no products for
         | sale that conduct surveillance and collect data about web users
         | as a "business model" and, unsuprisingly, in most cases, cannot
         | turn a profit. No one envisioned a few gigantic websites with
          | billions of pages doing this that manage to engorge themselves
         | with advertiser spend and take over the web.
         | 
          | What (who) is Section 230 really for?
         | 
         | If centralisation and single points of failure are undesirable,
         | and Big Tech represents centralisation and gatekeeper
         | chokepoints that can and do "fail", then perhaps it's worth
         | considering that Section 230 protection for Big Tech is what
         | allows this status quo to continue and for the situation to
         | worsen.
         | 
         | FN1. In Bezos' case, books.
        
         | 1vuio0pswjnm7 wrote:
         | https://en.wikipedia.org/wiki/Zeran_v._America_Online,_Inc.
         | 
          | This was neither illegal nor obscene content. However, it had a
         | severe effect on a third party. Arguably, not as severe as the
         | effect that the terrorist group videos that Google recommended
         | had on the Gonzalez family.
        
         | nindalf wrote:
          | Remember that most content written by most people is actually
          | posted on platforms owned and moderated by someone else.
         | You say it won't stop anyone but how many people currently
         | posting on Twitter, Facebook, YouTube and TikTok are capable of
         | creating their own websites, posting the content there, paying
         | for the cost of hardware and bandwidth and also finding an
         | audience somehow? 0.1%? 0.01%?
         | 
         | The rest of your comment is equally wrong (perhaps even low
         | quality, low brow, incendiary, false), but I have little time
         | to fact check it.
        
           | joeyh wrote:
           | They would hop on a p2p or federated social network in a
           | heartbeat if it were the only option.
           | 
            | The fact that the CDA and Section 230 have tilted the
            | internet away from those, by propping up the currently
            | large businesses, does not mean that's the only way the
            | internet can work.
        
             | [deleted]
        
         | lliamander wrote:
         | > Section 230 is not incentivising them to fact-check and
         | moderate, as it was intended.
         | 
          | I think you misunderstand the intent of section 230. The vision
          | of the bill was to support an internet of relatively free and
          | unconstrained speech. No one expected these platforms to "fact-
          | check" anything, and the moderation was not to rule on who was
          | right or wrong, but mainly to give them leeway to keep out
          | genuinely illegal (and to a lesser extent, obscene) content.
        
           | tzs wrote:
            | > The vision of the bill was to support an internet of
            | relatively free and unconstrained speech. No one expected
            | these platforms to "fact-check" anything, and the moderation
            | was not to rule on who was right or wrong, but mainly to give
            | them leeway to keep out genuinely illegal (and to a lesser
            | extent, obscene) content.
           | 
           | How did you reach that conclusion?
        
       | i_dont_know_ wrote:
        | The whole point of Section 230 was that we wanted to
        | incentivize user-content platforms (so, services that just
        | allowed people to freely express themselves without concerning
        | themselves too much about the content) while simultaneously
        | encouraging those services to take down illegal content. That
        | was it.
       | 
       | The whole reason it was needed was because these platforms were
       | worried that if they took down content, it might be seen as
       | implicitly endorsing the content that remained. If that was the
       | case, then they'd be liable for the content that remained.
       | Including things like libel laws... it would be the same as a
       | newspaper printing that content.
       | 
       | They didn't want that, so this was the compromise. You can do
       | light moderation and we'll grant you exemption.
       | 
       | Fast-forward to today. Facebook and Twitter do not do "light
       | moderation". They decide what you see and what you don't... they
       | direct your attention from smaller stories to bigger ones, they
       | spend countless thousands of employee hours catering to the whole
        | experience. If that doesn't count as a modern, digitally scaled
        | form of editorializing, I don't know what would.
       | 
       | I think a law designed to encourage very light moderation in no
       | way applies to full-fledged algorithmic determination, and that
       | distinction needs to be made and clarified.
        
         | LatteLazy wrote:
         | Services could always take down illegal content (in fact they
         | were legally required to).
         | 
         | Section 230 is specifically for taking down legal content that
         | the site controller wants gone (without then having endorsed
          | what remains, as you say).
         | 
         | Section 230 was specifically to allow MORE than what you call
         | "light moderation". That was always the point.
        
         | lliamander wrote:
         | You precisely describe the nature of section 230.
         | 
         | > I think a law designed to encourage very light moderation in
         | no way applies to full-fledged algorithmic determination, and
         | that distinction needs to be made and clarified.
         | 
         | I think this is where the definition of "good faith" in section
         | 230 comes into play. If the platforms are simply removing
         | illegal content and protecting the users from content they
          | don't want to see, that would be good faith in the sense that
          | it is putting the interests of the users first.
         | 
         | Now, optimizing feeds to benefit advertisers, or trying to
         | socially engineer democratic elections? That is not good faith.
         | 
         | And yet, some of these actions are done _at the request of_
         | politicians and government agencies - often with the implicit
         | threat of regulation and anti-trust action.
         | 
          | As much as we need to expect good faith moderation from social
          | media platforms, we also need stronger protections for private
          | entities against being strong-armed by state actors.
        
           | qball wrote:
            | >we also need stronger protections for private entities
            | against being strong-armed by state actors.
            | 
            | It's important to note that this also needs to apply (though
            | I suspect it already does) to hosting providers and ISPs.
            | (We need one for banks and payment processors, too, but one
            | step at a time.)
            | 
            | The whole "just make your own Internet" approach will
            | ultimately be the death of free thought; in some respects,
            | this has already happened. Cloudflare in particular makes a
            | bundle on flat-out illegal content; forcing them to either
            | moderate everything or just accept everyone's business (and
            | ensuring ISPs can't blackhole routing requests) would likely
            | be an improvement, and not one the enemy can as easily
            | influence (as there's no viable "our payment processors said
            | so" excuse).
           | 
           | Of course, then the enemy will just amp up their efforts
            | through the banks or the app stores, but one less avenue of
            | attack for them is always better.
        
           | edmundsauto wrote:
           | Perhaps I missed it, but what is the evidence for socially
           | engineering elections? Cambridge Analytica was actually a
           | violation of FB. I have seen emails where they discussed
           | removing content or accounts due to policy violations, but
           | that doesn't sound like what you mean. While the employee
           | base is generally left leaning, there are plenty of studies
           | showing that platforms actually favored conservative voices.
           | 
           | Can you elaborate with details of what you meant?
        
             | philipwhiuk wrote:
             | > Cambridge Analytica was actually a violation of FB.
             | 
             | Of their terms of service, less so the actual law.
        
               | disgruntledphd2 wrote:
               | Look, that stuff about the Big 5 and microtargeting just
                | didn't work. How Cambridge Analytica actually made money
               | was sting operations on politicians featuring hookers and
               | hotel rooms.
        
           | danShumway wrote:
           | > You precisely describe the nature of section 230
           | 
           | In the original authors' own words
           | (https://www.wyden.senate.gov/news/press-releases/sen-
           | wyden-a...):
           | 
           | > Section 230 protects targeted recommendations to the same
           | extent that it protects other forms of content presentation.
           | [...] That interpretation enables Section 230 to fulfill
           | Congress's purpose of encouraging innovation in content
           | presentation and moderation. The real-time transmission of
           | user-generated content that Section 230 fosters has become a
           | backbone of online activity, relied upon by innumerable
           | Internet users and platforms alike. Section 230's protection
           | remains as essential today as it was when the provision was
           | enacted.
           | 
           | The original authors believe that algorithms are protected.
           | 
           | The original authors of Section 230 also don't believe that
           | Section 230 prohibits biased platforms or prohibits platforms
           | from having an agenda. Ron Wyden's take on the "platforms are
           | biased" argument
           | (https://www.vox.com/recode/2019/5/16/18626779/ron-wyden-
           | sect...):
           | 
           | > Section 230 is not about neutrality. Period. Full stop.
           | 
           | ----
           | 
           | That being said, should we have better protections for
           | platforms being strong-armed by state actors? Yes,
           | absolutely. The government has many levers it can pull to
           | influence private speech, and those levers need careful
           | safeguards and we need checks in place to prevent state
           | actors from threatening platforms and using political power
           | to bully them into making specific moderation decisions.
           | 
           | But while that is an admirable goal, it has nothing to do
           | with Section 230, a law that is itself a check on government
           | power to punish companies over moderation decisions.
           | 
           | If you're worried about private entities being influenced by
           | state actors -- as you say "often with the implicit threat of
           | regulation and anti-trust action" -- then giving the
           | government more power to determine what is and isn't "good
           | faith" moderation is heading in the wrong direction and would
           | make the problem even worse.
           | 
           | The solution to the government strong-arming platforms into
           | removing content is not to give the government more power
           | over moderation decisions.
        
         | asah wrote:
         | serious q: what exactly is "light moderation" and how is this
         | not itself an algorithm, just being executed by wetware neural
          | nets with all sorts of biases, inconsistent judgments, etc.?
         | 
         | if "light moderation" means an escalation path for exceptions,
         | then the major platforms all have this.
        
           | mjevans wrote:
            | When content is offered not because of an explicit request
            | from the user but as a "recommendation", that is a decision
            | by the provider (even if an algorithm makes it). It is NOT
            | the simple act of serving whatever matched a filter or
            | sorting order that the end user (not the provider with 230
            | immunity) requested, applied to content that other end users
            | (not the provider with 230 immunity) published.
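            | 
            | A toy sketch of the distinction (all names are made up,
            | purely to illustrate the point):
            | 
            |   # The end user supplies both the filter and the sort
            |   # order; the provider merely executes the request.
            |   def user_requested_feed(posts, keep, sort_key):
            |       return sorted([p for p in posts if keep(p)],
            |                     key=sort_key)
            | 
            |   # The provider's own scoring decides what surfaces,
            |   # with no corresponding user request.
            |   def provider_recommended_feed(posts, score):
            |       return sorted(posts, key=score, reverse=True)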
        
         | matthewdgreen wrote:
         | Let me make a counterargument:
         | 
         | If you're a government that wants to have substantial control
         | over what sorts of content is available to users, then Section
         | 230 is a problem for you. What you _don't_ want is a hundred
         | million people responsible for their own posts. What you _do_
         | want is a few dozen services that run the Internet and who you
         | can say are "editorializing" and who thus face all kind of
         | liability unless they obey specific speech codes that
         | politicians then set. You might think that government control
         | over platforms' speech is prevented by the First Amendment, but
         | that's the insidious nature of Section 230: if you passed laws
         | controlling how newspapers editorialize, the laws would be
          | struck down as violating the First Amendment. If instead you
          | pass laws modifying platforms' civil liability for users'
          | speech, it might have the same overall effect as the explicit
          | laws would have on newspapers. Yet some will argue that it's
          | constitutional to do so.
         | 
         | But surely, you argue, you're not opposed to letting firms do
         | light moderation. They just can't do _heavy_ moderation, like
         | kick off users for "political" reasons or use any algorithms to
         | help users discover content. But nobody has a clue what
         | "political" means here and how it differs from spam, and coming
         | up with a definition is not the job of the US government or the
         | courts. Similarly, does anyone seriously think that firms are
         | going to stop using content discovery algorithms and become a
         | pure "common carrier" of all content? Of course they won't.
         | They'll just accept whatever speech codes the government
         | develops and they'll obey them so that they can continue to
         | make money.
         | 
         | It's depressing to see people on HN walk willingly into a
         | speech-regulation regime for private companies, while claiming
         | that the reason they want this is to ban algorithmic discovery.
         | If you want to ban algorithms, just pass a law doing so and see
         | if it's constitutional. Instead we get this end-run where the
         | result of these reforms is very likely to be much worse than
         | the current situation _and more critically we will still have
         | most of the algorithms and moderation_ used to justify it.
        
         | danShumway wrote:
         | > You can do light moderation and we'll grant you exemption.
         | 
         | Another article about Section 230, another top-rated post on HN
         | that's wrong about its origin. Section 230 was always designed
         | from the start to allow companies to moderate _legal_ content.
         | "Light" moderation was never part of the equation.
         | 
         | Look, the people who literally wrote and sponsored Section 230
         | are still alive today and they are open about what their
         | motivations were, and yet still every single time this subject
         | comes up the top-rated comment on HN is some completely
         | fictional notion about how Section 230 was based on an
         | assumption that feeds wouldn't be algorithmic or wouldn't
         | censor anybody.
         | 
         | And it's just factually wrong, and I don't understand why it's
         | impossible to correct. It's difficult to have a conversation
          | about the political/social _merits_ of a law when people don't
         | even understand the basic factual information about why the law
         | exists.
         | 
         | Imagine if every time an article about Linux came up the top-
         | rated HN post was someone saying that Linux was never really
         | designed to be a desktop OS, it was always intended to just be
         | used on servers and only on servers -- and it didn't matter if
         | Linus Torvalds himself was out making the rounds correcting
         | people on that, we still had to have this conversation every
         | time Linux was mentioned. That's how HN discusses Section 230.
         | 
         | Section 230 was never about preventing algorithms. It was not
         | designed to prevent "heavy" moderation. Again, the people who
         | wrote the law are alive today and have explained their
         | motivations.
         | 
         | ----
         | 
          |  _Edit: it's been pointed out (correctly) that I should
         | probably throw some sources on this. Techdirt's article is a
         | little snarkier than I like, but is generally good
         | (https://www.techdirt.com/2020/06/23/hello-youve-been-
         | referre...), and even the Wikipedia article on Section 230 is a
         | decent place to start looking at 230 motivations
         | (https://en.wikipedia.org/wiki/Section_230). There's also a
          | fairly decent book, "The Twenty-Six Words That Created the
          | Internet."_
         | 
         |  _In regards to the parent comment, from Wikipedia:_
         | 
         |  _> Service providers made their Congresspersons aware of these
         | cases, believing that if followed by other courts across the
         | nation, the cases would stifle the growth of the Internet.
         | United States Representative Christopher Cox (R-CA) had read an
         | article about the two cases and felt the decisions were
         | backwards.  "It struck me that if that rule was going to take
         | hold then the internet would become the Wild West and nobody
         | would have any incentive to keep the internet civil," Cox
         | stated._
         | 
          |  _Cox's concern was not that platforms should only do "light"
          | moderation; he wanted a way for platforms to be able to also
          | moderate completely legal but "uncivil" content that would make
          | it hard for platforms to be professional or organized; a "Wild
          | West."_
         | 
         |  _Ron Wyden goes a step further in his recent interviews
         | (https://www.vox.com/recode/2020/8/5/21339766/zuckerberg-
         | priv...):_
         | 
         |  _> "There is not a single word -- not a word, not a comma, not
         | a parenthesis, nothing -- that suggests that 230 was about
         | neutrality. In fact, what we designed it to be is if you had a
         | conservative website, good for you! Do what you want with it!
         | If you had a progressive website, same thing."_
         | 
          |  _And if that's not convincing to you, consider that both Ron
         | Wyden and Christopher Cox have filed an amicus brief on this
         | very case, saying that they believe Section 230 should protect
         | Google (https://www.wyden.senate.gov/news/press-releases/sen-
         | wyden-a...):_
         | 
         |  _> The co-authors reminded the court that internet companies
         | were already recommending content to users when the law went
         | into effect in 1996, and that algorithms are just as important
         | for removing undesirable posts as suggesting content users
         | might want to see._
         | 
          |  _Really, not much can make it clearer than that: the
          | assertion that Section 230 wasn't designed to protect
          | algorithmic recommendations is factually wrong. This is not
          | something that's a matter of opinion._
        
           | jfengel wrote:
           | I find this to be a common pattern. "Moderation should be
           | there to moderate against the things that concern me, but not
           | against anything I might want to do or see."
           | 
           | I interpret "light moderation" to mean "I disapprove of spam
           | and child porn, but I'm not subject to racism or sexual
           | harassment, and I don't personally know anybody harmed by my
           | conspiracy theories."
        
           | [deleted]
        
           | d23 wrote:
           | Thanks for this response. One thing that would make it even
           | more compelling is citing sources.
        
             | danShumway wrote:
             | Good point, thanks for asking about references.
             | 
             | I've updated the post above with links to a couple of
             | decent resources and with quotes from the original authors
             | of Section 230, as well as with information about the
             | recent amicus brief they filed to the Supreme Court where
             | they explicitly argue that Section 230 should protect
             | algorithmic recommendations.
        
               | d23 wrote:
               | Great stuff, thanks.
        
           | catiopatio wrote:
           | Your post is itself dishonest by (gross) omission -- you
           | don't even mention the Communications Decency Act, the
           | legislation that introduced Section 230.
           | 
           | "The Communications Decency Act of 1996 (CDA) was the United
           | States Congress's first notable attempt to regulate
           | pornographic material on the Internet. In the 1997 landmark
           | case Reno v. ACLU, the United States Supreme Court
           | unanimously struck the act's anti-indecency provisions."
           | 
           | ... leaving behind the indemnification provided by Section
           | 230 intended to ensure that hosting providers were not liable
           | for user content under the CDA.
           | 
           | That's it. Nothing more complicated.
        
             | danShumway wrote:
             | > ... leaving behind the indemnification provided by
             | Section 230 intended to ensure that hosting providers were
             | not liable for user content under the CDA.
             | 
             | > That's it. Nothing more complicated.
             | 
             | What on earth about those two sentences suggests that
             | algorithmic content wouldn't be protected or that Section
             | 230 would require platforms to be neutral?
             | 
              | Again, and I can't stress this enough: the people who
              | _wrote_ Section 230 disagree with you. You are arguing
              | that the people who wrote Section 230 are wrong about
              | their own motivations for writing it.
             | 
             | Why is this so hard for people? What part of Ron Wyden
             | literally saying "Section 230 is not about neutrality" are
             | people getting hung up on? What wiggle room exists in that
             | quote? When you have the actual authors telling you that
             | they wanted platforms to be free to be Conservative/Liberal
             | biased, and you have the actual authors writing briefs in
             | support of Google for the Supreme Court, where on earth are
             | you getting the idea that Section 230 is just about
             | pornography or illegal content, or that recommendation
             | algorithms wouldn't be protected?
        
               | catiopatio wrote:
               | What Wyden happens to claim _now_ about the CDA and
               | Section 230 is irrelevant.
               | 
               | Were you alive and on the internet when the CDA
               | (including section 230) was passed?
               | 
               | I was; the public debate over the CDA and justifications
               | for the Section 230 carve-out bear very little
               | resemblance to what Wyden claims now.
        
               | danShumway wrote:
               | > What Wyden happens to claim now about the CDA and
               | Section 230 is irrelevant.
               | 
               | With respect:
               | 
               | A) no it's not, or you wouldn't be on here arguing about
               | intentions. Nobody who actually believes in Death of the
               | Author wastes their time arguing about why a thing was
               | originally made. If people were really embracing Death of
               | the Author in regards to Section 230, we would just be
               | talking about its merits, and that would be great! But
               | instead we're doing this.
               | 
               | B) it's unbelievably silly to ask people to trust your
               | memory as a random internet commenter over both the
               | original authors and over the many academic articles and
               | legal decisions that have been written about Section 230.
               | I guess every single legal precedent about Section 230 is
               | also wrong about its intentions?
               | 
               | C) recommendation algorithms existed when Section 230 was
               | written. They are not a new thing that Congress didn't
               | know about in 1996.
               | 
               | D) biased platforms existed when Section 230 was written.
               | They are not a new thing that Congress didn't know about
               | in 1996.
               | 
               | E) without revealing too much about my age, I was alive
               | in 1996, and while I wasn't particularly involved in the
               | Internet at that point, I've been involved on the
               | Internet long enough to have seen how the conversation on
                | Section 230 has evolved, and "evolved" is the right word
                | to use. I think the "algorithms aren't protected" argument
               | is a fairly recent invention. The critiques people are
               | making today are different from the critiques they used
               | to make online.
               | 
               | ----
               | 
               | And as respectfully as it is possible for me to phrase
               | this:
               | 
               | Whatever your personal memory is about your motivations
               | or the motivations of the people around you when Section
               | 230 was proposed -- and I am not doubting you that maybe
               | you did have a certain view of Section 230 back then, or
               | maybe you were seeing reporting that phrased the
                | provision in a certain light -- but that perspective is
                | irrelevant to what the law actually says and the way it
                | has been interpreted ever since it came out. It is
                | unbelievable that Section 230's origins are so well
                | documented and yet there are still people on HN saying,
                | "well, but those sources are all wrong, _I_ remember
                | what was going on at the time."
               | 
               | Maybe you did think of Section 230 a certain way back
               | then. So? I'm supposed to trust you over the original
               | authors?
               | 
               | In what other topic do we tolerate this kind of logic on
                | HN? Are we seriously now arguing that the bipartisan
               | politicians who wrote Section 230 are... lying about what
               | they meant? That's where we are right now?
               | 
               | At the very least, you'd better have some sources to back
               | up what you're saying. The "algorithms are different"
                | argument is very recent and, from what I've seen, doesn't
                | really have any evidence behind it, suggesting that it is
                | nothing more than a fantasy about the law's origins that
                | 230 critics would like to believe is true.
               | 
               | ----
               | 
                |  _Edit: Okay, actually, let's just nip this in the bud
               | rather than argue back and forth. Here's how courts were
               | talking about Section 230 in 1998, only 2 years after it
               | was passed (https://cyber.harvard.edu/property00/jurisdic
               | tion/blumenthal...):_
               | 
               |  _> If it were writing on a clean slate, this Court would
               | agree with plaintiffs. AOL has certain editorial rights
               | with respect to the content provided by Drudge and
               | disseminated by AOL, including the right to require
               | changes in content and to remove it; and it has
               | affirmatively promoted Drudge as a new source of
               | unverified instant gossip on AOL. Yet it takes no
               | responsibility for any damage he may cause. AOL is not a
               | passive conduit like the telephone company, a common
               | carrier with no control and therefore no responsibility
               | for what is said over the telephone wires. n11 Because it
               | has the right to exercise editorial control over those
               | with whom it contracts and whose words it disseminates,
               | it would seem only fair to hold AOL to the liability
               | standards applied to a publisher or, at least, like a
               | book store owner or library, to the liability standards
               | applied to a distributor. n12 [52] But Congress has made
               | a different policy choice by providing immunity even
               | where the interactive service provider has an active,
               | even aggressive role in making available content prepared
               | by others. In some sort of tacit quid pro quo arrangement
               | with the service provider community, Congress has
               | conferred immunity from tort liability as an incentive to
               | Internet service providers to self-police the Internet
               | for obscenity and other offensive material, even where
               | the self-policing is unsuccessful or not even attempted._
               | 
               |  _So in summary, in 1998 courts were saying "yeah, it
               | looks like this is a website making an explicit
               | recommendation because it's actively promoting content
               | instead of acting like a neutral telephone network, and
               | we actually personally don't think that should be
               | protected, but very clearly Congress has protected it and
               | Section 230 applies to it." Their exact words are: "even
               | where the interactive service provider has an active,
               | even aggressive role in making available content prepared
               | by others."_
               | 
               |  _But sure, Section 230 was only about getting rid of
               | porn and had nothing to do with algorithms. /s The
               | reality is that even all the way back in 1998, courts
               | already recognized that Section 230 applied to "promoted"
               | content and was not qualified on an assumption that
               | platforms would act like a common carrier. Even early on,
               | courts already recognized that Section 230 protected more
               | content than just pornography or indecency._
        
               | [deleted]
        
       | kart23 wrote:
       | comments here are a great example of the hn bubble. killing
       | youtube, instagram or facebook would be a massive net negative
       | for society right now. the amount of money and people that depend
        | on these platforms far outweighs the concerns of people
        | complaining about corporate censorship.
        
         | em-bee wrote:
         | how is that? it won't be more than a disruption. for each of
         | these services alternatives exist, and just like mastodon has
         | seen more users since the recent twitter upheaval, other
         | services, existing or new, will quickly fill in any gap left by
         | the demise of an existing service.
        
       | chmod600 wrote:
       | "This is the law colloquially known as Section 230, which is
       | probably the most misunderstood, misreported, and hated law on
       | the Internet."
       | 
       | That by itself is a problem. For a law like this to survive,
       | someone needs to explain to voters what problems it solves, the
        | way companies take advantage of it, why that's hard to fix, and
       | why it's still worth having.
        
       | peyton wrote:
       | > these terrorism cases are about wanting platforms to take down
       | more content, to step in more to prevent bad things from
       | happening. And I think everybody sympathizes with that goal,
       | whatever we think is the right set of rules to achieve it without
       | a bunch of collateral damage.
       | 
        | The interviewee is a bit biased. I don't think everyone
        | sympathizes with the goal of taking down more content.
        
         | threeseed wrote:
         | 4chan and early Reddit are classic examples of this.
         | 
          | There are people who prefer abhorrent content and will defend
          | it.
         | 
         | And they want this content on mainstream sites so they can feel
         | like they are normal.
        
           | throwawaylinux wrote:
           | Yes that's true. And there are others who believe governments
           | are the peddlers and enablers of the most abhorrent crimes,
           | from slavery to arms trafficking to war and assassination and
           | "intervention" to human experimentation, and should
           | absolutely be fought at every step of the way in their
           | neverending thirst for more power and control, particularly
           | where private speech is concerned.
           | 
           | The former group sadly does make a great strawman for people
           | who are emotionally hysterical and don't care to think very
           | deeply about the issue though.
        
             | EMIRELADERO wrote:
             | > And there are others who believe governments are the
             | peddlers and enablers of the most abhorrent crimes, from
             | slavery to arms trafficking to war and assassination and
             | "intervention" to human experimentation, and should
             | absolutely be fought at every step of the way in their
             | neverending thirst for more power and control, particularly
             | where private speech is concerned.
             | 
              | Which is (in my opinion) a very immature position to take.
              | Look at the governments of western Europe, particularly
              | Scandinavia.
        
               | throwawaylinux wrote:
               | Oh, well I looked at them and in my opinion it is _not_ a
               | very immature position to take. Solid debate.
        
               | [deleted]
        
           | ever1337 wrote:
           | 4chan and reddit are both mainstream sites
        
           | jterrys wrote:
           | >And they want this content on mainstream sites so they can
           | feel like they are normal.
           | 
           | I think they want this content on mainstream sites because
            | they know it pisses off people and makes them laugh.
        
         | Sakos wrote:
         | It's the same manipulative argument you see with "protecting
         | the children". Everybody agrees with the goal, so obviously
         | we're doing the right thing by curtailing everybody's freedoms.
        
           | forgetfreeman wrote:
           | How about instead of "curtailing freedoms" we attach some
           | really aggressive consequences for poor choices as regards
           | broadcasting obvious bullshit in public spaces? We good now?
        
             | Sakos wrote:
             | I'm absolutely fine with not necessarily being able to say
             | anything and everything. I'm generally supportive of anti-
             | hate speech laws and I'm not a free speech absolutist.
             | However, just because I might agree with their goal doesn't
             | mean I agree with how they choose to achieve it and I'm
             | skeptical of the above approach.
        
             | Idk__Throwaway wrote:
             | You support this until you and the arbiter of "obvious
             | bullshit" begin to disagree.
        
             | perryizgr8 wrote:
             | How about we treat freedom of speech like the inalienable
             | right it is? We better now?
        
       | vuln wrote:
       | No worries I'm already dead from lack of Net Neutrality. /s
        
       | weitzj wrote:
       | So maybe this will lead to more p2p Internet services like IPFS.
        | And maybe for Google and such you would have to use their p2p
        | version, probably baked into the Chrome browser, with an
        | authentication system.
       | 
       | Something like the old AOL dialup but decentralized like ipfs and
       | per website or with some kind of audit trail.
        
       | Dalewyn wrote:
       | I'm not about to read the article or concern myself with the
       | minute details of the cases, but I will say I hope SCOTUS throws
       | out section 230 protections for the likes of Google, Twitter,
       | Facebook, Reddit, Hacker News, et al.
       | 
       | If you engage in editorializing (read: controlling) content
        | beyond what is necessary to comply with laws and regulations
        | (e.g. no child porn, no tangible death threats), then you should
        | be liable for the content you subsequently publish.
       | 
        | Paper media abides by this; there is no reason why internet-
        | based media cannot abide by the same.
        
         | jterrys wrote:
         | I think that's a pretty sledgehammer-y view.
         | 
         | Like, I make a website for pregnant women and someone comes and
         | rants about American politics. What the fuck am I supposed to
         | do if I can't delete their comments? Let my project dwindle to
         | shit because I'm "editorializing" content by deleting this
         | obvious off-topic discussion?
         | 
         | Am I now responsible for the opinions of other people because,
         | according to your logic, I now endorse them by not deleting
         | their messages? You've basically reduced all forms of discourse
         | by introducing liability into an open discussion. Someone gives
         | incorrect advice that someone else takes on my forum. They kill
          | their baby. Whoops, time for me to go to fucking jail for
         | someone else's opinion.
         | 
         | I think you have a problem with platforms and monopolistic
         | behavior that's designed to colonize your attention.
         | 
         | The fact that searches on the internet are dominated by one
         | private for-profit entity is a problem. The fact that the
         | majority of internet traffic gets siloed into a few websites,
         | which aggressively (and honestly sometimes borderline illegally
         | and definitely morally reprehensibly) snuff out competition is
         | a problem. The fact that registering a domain, setting up an
          | internet connection, and trying to host a website are all
          | for-profit, privately driven workflows is a problem. The fact
          | that all of the above can easily be taken away from you, due
          | to a denial of service, which has nothing to do with the
          | legality of the content you host or your opinions, is a
          | problem. All of these are serious problems that require more
          | than a band-aid, sledgehammer solution aimed at section 230.
        
           | Dalewyn wrote:
           | >I make a website for pregnant women and someone comes and
           | rants about American politics. What the fuck am I supposed to
           | do if I can't delete their comments? Let my project dwindle
           | to shit because I'm "editorializing" content by deleting this
           | obvious off-topic discussion?
           | 
            | Yes; do nothing and assume no liability. If you don't want to
           | assume any liability you don't involve yourself beyond
           | providing a platform. Simple as that.
           | 
           | >I now endorse them by not deleting their messages?
           | 
           | You endorse nothing because you assume no liability by doing
           | nothing (abiding by applicable laws notwithstanding).
           | 
           | >Someone gives incorrect advice that someone else takes on my
           | forum. They kill their baby. Whops, time for me to go to
           | fucking jail for someone else's opinion.
           | 
           | You are not liable for the "incorrect advice" if you did not
           | involve yourself in its creation and publication, beyond
           | providing a platform.
           | 
           | >I think you have a problem with platforms and monopolistic
           | behavior that's designed to colonize your attention.
           | 
           | I have a problem with publishers pretending to be platforms
           | and enjoying the benefits of being both a publisher and a
           | platform while evading all liabilities thereof.
           | 
           | >The fact that searches on the internet are dominated by one
           | private for-profit entity is a problem.
           | 
           | It is, but it is neither here nor there.
           | 
           | >The fact that the majority of internet traffic gets siloed
           | into a few websites, which aggressively (and honestly
           | sometimes borderline illegally and definitely morally
           | reprehensibly) snuff out competition is a problem.
           | 
           | It is, and the removal of section 230 protections would serve
           | to partially alleviate the problem.
           | 
           | >The fact that registering a domain, setting up an internet
           | connection, and trying to host a website, are all for profit,
           | private driven workflows is a problem.
           | 
           | Hosting a website /should/ be a private endeavour for the
           | sake of ensuring free speech. Once upon a time, making your
           | own website (and assuming all the liabilities that entailed)
           | was how you voiced your thoughts on the internet instead of
           | posts on Twitter and Reddit.
           | 
           | >The fact that all of the above can easily be taken away from
           | you, due to a denial of service, which has nothing to do with
           | the legality of the content you host or your opinions, is a
           | problem.
           | 
            | That is a structural problem with how the internet is formed.
            | Tangentially related though it may be, it is not directly
            | relevant to the question of free speech.
           | 
           | >All of these are serious problems that require more than a
           | band-aid, sledgehammer solution aimed at section 230.
           | 
           | Removing section 230 would only address some of the problems
           | the internet faces, but it would be a start.
        
             | tzs wrote:
             | So if I run a chess forum and delete posts that are not
              | about chess, and I delete 10 legal but off-topic posts
             | about baby care that give advice that will kill some
             | babies, but I happen to miss one of those baby care posts
             | and some babies die, I'm liable because I was actively
             | deleting some legal posts?
             | 
             | But if I don't delete posts (other than those I'm legally
             | required to delete), so let all the baby care posts remain
             | I'm free from liability?
             | 
             | So the net result is that my chess forum (and all other
             | forums run by entities that don't have the resources to
             | review every post before allowing it to go public) becomes
             | a topic-free general forum.
             | 
             | And you think this is a _good_ thing!?
        
             | jterrys wrote:
             | >Yes; do nothing and assume no liablity. If you don't want
             | to assume any liability you don't involve yourself beyond
             | providing a platform. Simple as that.
             | 
             | So your solution is to actually do nothing and let the
             | project die? What an awful non-solution.
        
               | less_less wrote:
               | Worse than that: their "solution" is that all platforms
               | should be 8chan.
        
         | spaced-out wrote:
         | >If you engage in editorializing (read: controlling) content
         | beyond what is necessary to comply with laws and regulations
          | (e.g. no child porn, no tangible death threats), then you should
         | be liable for the content you subsequently publish.
         | 
         | So how do you propose forums deal with spam?
        
           | Dalewyn wrote:
           | You don't.
           | 
           | Seriously.
           | 
           | That is the price for avoiding liability: To be able to say
           | you had nothing to do with it beyond providing a platform and
           | abiding by other applicable laws.
        
             | [deleted]
        
             | nickthegreek wrote:
             | So I have to allow my resources to be clogged up by others
             | who have no skin in the game? It is my hard drive; I get to
             | choose the ones and zeros on my private property. You do
             | not get to fill them up with spam that I have to maintain
             | forever.
        
               | Dalewyn wrote:
               | You're welcome to control what goes on your hard drive,
               | but the price is (or at least should be) you're assuming
               | liability.
        
           | throwawaylinux wrote:
           | Some reasonable terms of service might be required. Punishing
           | political opinions or criticisms of corporations or
           | governments, allowing government agencies to spread
           | misinformation and censor people, etc., is certainly not
           | necessary for dealing with spam though.
        
             | Gh0stRAT wrote:
             | ...and what happens when spammers realize they can mix in a
             | political opinion with their Canadian pharmacy spam in
             | order to gain immunity from the filters?
             | 
             | Prepare for a world of spam like: "Your car's extended
             | warranty is about to expire unless you act NOW. Press 1 to
             | connect to a specialist, or press 2 to acknowledge that the
             | FCC should be dismantled."
             | 
             | All of a sudden, removing spam is stifling political speech
             | and is forbidden.
        
               | throwawaylinux wrote:
               | Still looks like spam to me. Arguing minutiae about what
               | exactly spam looks like or the technical details of how
               | it would get filtered is putting the cart before the
               | horse though, and it is pretty pointless to get bogged
               | down in here.
               | 
               | First you agree on the principles, and then you can talk
               | about how they might be implemented and regulated. Do you
               | agree in general that political opinion should not be
               | punished or censored? If not you can just say that, no
               | need to come up with increasingly byzantine ways that it
               | couldn't be made to work.
        
               | Gh0stRAT wrote:
               | I think people should be free to make spaces where they
               | exclude some things as off-topic, flamebait, trolling,
               | etc. E.g., if I want to run a forum about woodworking, I
               | should be free to remove posts about abortion, guns, CRT,
               | or whatever political hot button issue people might post
               | about.
               | 
               | In principle, I think the 1st Amendment protects us from
               | the government censoring speech, not from private parties
               | doing so.
        
               | Dalewyn wrote:
               | >people should be free to make spaces where they exclude
               | some things
               | 
               | They are. The problem is such people want that freedom
               | with none of the liability.
        
               | UncleMeat wrote:
               | If it still looks like spam to you then we are right back
               | where we started, with people disagreeing about what
               | counts as essential speech to protect and what doesn't.
        
         | jrsj wrote:
         | The problem is there won't be any sites that actually refrain
         | from editorializing, because they'll all be open to lawsuits
         | over content on their site. That would be bad enough on its
         | own, but a site with that kind of content policy would be one
         | that advertisers and financial institutions want nothing to do
         | with, so it wouldn't have the funding to adequately defend
         | itself in court.
        
         | jedberg wrote:
         | If that came to pass, then the only recourse the companies
         | would have is to either let everything stay up (and if you
         | enable showdead on here you'll see what kind of garbage that
         | is) or make every single submission sit in a queue until a
         | human can review it.
         | 
         | This site would basically die.
        
           | x86_64Ubuntu wrote:
           | But that's what people want. If they can't spread lies and
           | untruths without being fact-checked, then we all lose
           | whatever platforms are tolerable only because of moderation.
        
         | threeseed wrote:
         | > Paper media abides by this
         | 
         | It's a meaningless comparison.
         | 
         | Traditional media produce and own their content. Online media
         | is user generated.
        
           | Turing_Machine wrote:
           | > Traditional media produce and own their content.
           | 
           | Phone companies don't. The traditional choices were "have
           | full control of the content, and take full responsibility for
           | that content" (e.g., newspapers) or "don't attempt to control
           | content, and bear no responsibility for that content" (e.g.,
           | telephone companies). Newspapers were legally responsible for
           | what they published; phone companies weren't. Phone
           | companies (and other so-called "common carriers") weren't
           | held responsible even if someone used the phone to plan a
           | murder.
           | In exchange for that immunity, they were required to give
           | access to anyone with enough money to pay the monthly bill.
           | 
           | > Online media is user generated.
           | 
           | As are phone calls.
           | 
           | With Section 230, tech companies both have a) full censorship
           | powers and b) no responsibility. Absolute power with no
           | responsibility has always been a recipe for abuse.
        
           | stonogo wrote:
           | Paper media does not produce the advertisements; the
           | customers or ad agencies do that. For decades the classifieds
           | section, entirely 'user generated', was of massive import to
           | society, to the degree that sunshine laws _still_ frequently
           | require governments to publish important announcements in the
           | classified section of the local "paper of record."
           | 
           | In other words, traditional media produce and own content,
           | but there has always been a tremendous amount of user-
           | generated media mixed in.
        
             | kelnos wrote:
             | Sure, but my impression is that readers submit classified
             | ads, someone at the paper says it's ok, and it gets
             | published in the classifieds.
             | 
             | A social media platform would not be viable if someone had
             | to approve every post before it went live. (Maybe that
             | means that social media platforms should just not exist,
             | but that's not really the specific point we're arguing
             | here.)
        
               | stonogo wrote:
               | What _is_ the specific point? That ownership of content
               | implies slow production? What if a social media network
               | established a contractor relationship with each user, and
               | paid them for their content at a rate of, say, one cent
               | per million posts, with a maximum of one hundred billable
               | posts per day?
               | 
               | The content would be coming thick and fast, the company
               | would have more standing to enforce its terms of service,
               | and you've dodged whatever moderation rules might apply
               | to user-generated content, since it's all work-for-hire
               | by our vast cadre of contractors.
               | 
               | I'm just not convinced that "traditional media own and
               | produce their content" is indicative of anything
               | particularly important, and just muddies the waters
               | surrounding the actual problems at play.
        
           | Dalewyn wrote:
           | Paper media also has user generated content. Such content has
           | been the liability of the publisher and things have worked
           | just fine for at least the last two centuries if not more.
           | 
           | Again: There is no reason internet-based media can't act like
           | their paper counterparts. The only plausible reason is
           | that nobody wants liability, but hey: You modify user
           | generated content in any way, /you the publisher/ are now
           | liable for it.
        
             | threeseed wrote:
             | I've worked for a large News Corp paper before.
             | 
             | The amount of user generated content in comments, letters
             | to the editor etc is tiny and was moderated by a single
             | person.
             | 
             | It is simply incomparable to the estimated 500 million
             | tweets a day on Twitter:
             | 
             | https://www.wordstream.com/blog/ws/2020/04/14/twitter-
             | statis...
        
               | _Algernon_ wrote:
               | Why is it my problem that their business model is stupid
               | and doesn't work when treated equally to traditional
               | publishing?
               | 
               | Equality before the law. If your business can't adapt it
               | should die.
        
               | [deleted]
        
               | cookie_monsta wrote:
               | Ahh yes, the old "we're too successful to be able to
               | operate responsibly" argument.
               | 
               | Here's a thought: there's nothing that says that Twitter
               | and friends have to exist. There was a time when being
               | able to comply with the law was part of the business
               | model, not a minor obstacle that needed to be worked
               | around.
               | 
               | As a (purely hypothetical) heroin dealer, I can't run a
               | business if I keep getting thrown in jail. What's the
               | solution?
        
               | threeseed wrote:
               | > "we're too successful to be able to operate
               | responsibly"
               | 
               | No. I am simply pointing out that comparing print media
               | to online makes no sense.
        
               | cookie_monsta wrote:
               | Yes, because print media's business model allows it to
               | operate responsibly
        
               | zuminator wrote:
               | Ok, so Twitter and Facebook go out of business, only to
               | be replaced by offshore businesses outside the legal
               | reach of what replaces section 230.
               | 
               | Honestly what you've done there is present a good case
               | for why dealing heroin should be legal.
        
               | trifurcate wrote:
               | Here's a thought: I am purely hypothetically able to sell
               | some previously unknown and unforeseen (therefore legal)
               | opioid agonist that is just as addictive and dangerous
               | as heroin. Or maybe it isn't even an opioid agonist,
               | and has some novel mechanism of action that is yet
               | unknown.
               | 
               | What's the solution? New legislation that adapts to such
               | a threat to societal welfare. And this is what is
               | effectively happening here with litigation over Section
               | 230 et al. (worth mentioning that it's not even the case
               | that Twitter's business model involves working around the
               | law; Section 230 _is_ the law and it shields Twitter from
               | liability. So there is really no case here like you are
               | describing to begin with.)
               | 
               | The end of society and government is more or less to
               | identify and avoid bad societal outcomes, and an evolving
               | legal system is necessary to keep up with technological
               | and social changes in the world. Maybe you are correct
               | and Twitter shouldn't exist to begin with, alas Twitter
               | exists and other things like Twitter exist. They aren't
               | currently in grave violation of any law to my knowledge,
               | and any law that is to be enacted regarding things like
               | Twitter should be written in light of the current state
               | of the world, which is that Twitter exists and that
               | things like Twitter are easy to create.
        
               | ultrarunner wrote:
               | Maybe it's fitting that this thread-- in its
               | consideration of the balance between freedom and
               | benevolent force-- invokes an analogy involving a subject
               | of the war on drugs.
        
               | cookie_monsta wrote:
               | Where I live psychoactive drugs are blacklisted by law.
               | If you want to sell a new one legally you need to apply
               | to get it put on the whitelist.
               | 
               | They did this so that hypothetical you can't dodge the
               | law by tweaking molecules and that new legislation
               | doesn't have to be enacted every time you come up with a
               | new recipe.
               | 
               | I'm not saying that twitter shouldn't exist, I'm just
               | saying that it and platforms like it don't have to exist
               | in their current state. Section 230 _is_ the workaround
               | that didn't have to be created just so that the platforms
               | could operate outside of the regulations that other
               | publishers are bound by.
        
             | kelnos wrote:
             | Newspapers are default-deny. The editors decide which
             | letters to the editor to publish.
             | 
             | Social media platforms are default-allow. If they were
             | default-deny, it would take months to get your newsfeed
             | posts approved.
             | 
             | Newspapers were also not designed to be a way for readers
             | to express themselves. Social media platforms exist so that
             | users can express themselves.
             | 
             | I'm not saying that social media companies are doing a good
             | job, or that their moderation practices are good, or that
             | their quest for increasing engagement above all else is
             | good. In fact I wouldn't mind if FB, Twitter, etc. just
             | disappeared tomorrow. But comparing them to newspapers is
             | just not a reasonable thing to do.
        
               | lodovic wrote:
               | With the current state of AI text processing, checking
               | whether a user-submitted post should be approved really
               | shouldn't take months but seconds.
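               | 
               | A minimal sketch of how such a pre-publication check
               | could look (the transformers pipeline API is real, but
               | the model name and the 0.5 threshold are illustrative
               | assumptions, not a vetted moderation policy):
               | 
               |     # Hypothetical screen: auto-approve low-scoring
               |     # posts, queue everything else for human review.
               |     from transformers import pipeline
               | 
               |     classifier = pipeline("text-classification",
               |                           model="unitary/toxic-bert")
               | 
               |     def auto_approve(post: str,
               |                      threshold: float = 0.5) -> bool:
               |         # One forward pass takes seconds, not months.
               |         score = classifier(post)[0]["score"]
               |         return score < threshold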
        
               | Nasrudith wrote:
               | Sentiment analyzers that flag bug reports as toxic but
               | consider beyond-the-pale statements like "I hope you
               | enjoy eating your family" the opposite of toxic?
        
         | smeagull wrote:
         | I guess every forum will become a general everything-goes
         | forum under your stupid rules.
         | 
         | Hacker News will now have to keep every post that isn't illegal
         | and no communities sorted by topic will be allowed to exist.
         | 
         | Could you think next time before you share an opinion?
        
           | Dalewyn wrote:
           | A small price to pay to guarantee free speech, as far as I'm
           | concerned.
           | 
           | Hacker News is welcome to continue editorializing its
           | content, of course; it just has to accept liability for it.
        
             | ultrarunner wrote:
             | How free is speech when the only medium allowed is
             | concurrent shouting? After all the attempts by governments
             | to disorganize citizens through censorship and outright
             | internet blockages, I almost can't think of a better
             | approach than to simply make platforms so drowned in chaos
             | as to be unusable.
        
               | Dalewyn wrote:
               | >How free is speech when the only medium allowed is
               | concurrent shouting?
               | 
               | I would argue that is the /epitome/ of free speech.
        
               | PenguinCoder wrote:
               | So to you, only the most vocal users, shouting the most
               | popular things, with everything else removed or hidden,
               | is the _epitome_ of free speech?
        
               | Dalewyn wrote:
               | Speech hindered by nothing is the epitome of free speech.
               | 
               | And no, other speech is not a hindrance to your speech
               | and vice versa.
        
               | PenguinCoder wrote:
               | https://xkcd.com/1357/
               | 
               | > I can't remember where I heard this, but someone once
               | said that defending a position by citing free speech is
               | sort of the ultimate concession; you're saying that the
               | most compelling thing you can say for your position is
               | that it's not literally illegal to express.
        
               | ultrarunner wrote:
               | Especially in the current case, it's an outcome only
               | arrived at through force.
        
               | pr0zac wrote:
               | I completely disagree with your opinion, but I do
               | appreciate how consistent and fully aware of the
               | consequences you are being with it.
        
         | UncleMeat wrote:
         | Do you believe that websites should be able to delete spam and
         | bot accounts?
         | 
         | Posting 10,000 links to a shitcoin ICO is not illegal.
        
       | bradwood wrote:
       | How does the outcome of a couple of court cases in one
       | jurisdiction, in one country, break the _entire, global_
       | internet?
       | 
       | This just sounds like clickbait.
        
       ___________________________________________________________________
       (page generated 2023-01-26 23:02 UTC)